on Thursday, November 24, 2011
Address Is Approximate

A beautiful stop-motion animation short produced, animated, filmed, lit, edited & graded by Tom Jenkins (theoryfilms.co.uk).
on Wednesday, November 16, 2011
The design world at large, it seems, is looking for non-linear collaboration. Parallel to previs in filmmaking, there is Building Information Modeling (BIM) in architecture.
Is BIM the holy grail of architectural collaboration? 

I came across an interesting article from The Guardian on this subject (linked here).


The UK Government is poised to mandate BIM for all public buildings, which will influence the way designers, architects and engineers collaborate on future projects.


Nemetschek VectorWorks offered a webinar on a third-party plug-in for their modeling software that allows virtual online collaboration in 3D, which is very useful for the early design phase. As a VectorWorks user, I am very interested to see how this will be applied to the VIRTUAL ART DEPARTMENT.


Register for the second part of this series: Using an openBIM Workflow (1 AIA LU Hour)
(see the link at the end of this post for Part One - video)


on Monday, November 14, 2011
With rapid advances in ‘real-time’ software and processing power, I wonder how long it will be before digital ‘previs’ becomes just plain ‘vis’.
"Previs is a collaborative process that generates preliminary versions of shots or sequences, predominantly using 3D animation tools and a virtual environment. It enables filmmakers to visually explore creative ideas, plan technical solutions, and communicate a shared vision for efficient production."

Brian Pohl, Visual Effects (VFX) artist and Previs Supervisor, says in his blog:
“…Technological integration has gone even further with a new process called virtual production. … the seamless unification between the virtual and the real…. Virtual production goes beyond previs' planning procedures and actually moves them out of preproduction and into principal photography itself.”

The technological integration Pohl cites above is in reference to James Cameron’s development of Virtual Production used on the movie Avatar, and is closer to “vis”.  Pohl, now with Autodesk's product design team, is a founding member of the Previs Society, a relatively new association of professional cinematic artists based in Los Angeles.
Previs artists are crossing the boundaries of the narrative design fields. In the production of film, television, games, live events, and architectural and industrial design, digital previs facilitates collaboration and supports an immersive design process. The cross-pollination of disciplines is at the heart of the 5D | Future of Immersive Design Conference and one reason why I joined the Previs Society.
Shared knowledge is a key part of this group's mission, imparted through online discussion and monthly events.
I attended a recent full-day event, the Previs Annual Forum 2011, held on October 22, featuring informative presentations and stimulating discussion with several top Previs Artists. A number of technology companies participated, demonstrating products, hosting lunch, and providing chances to mingle during coffee breaks, followed by a lively cocktail hour that evening.
Activision Tour
In September, Activision invited the Previs Society for a ‘sneak peek’ of its new state-of-the-art performance-capture studio. This opportunity was of special interest to me with regard to 5D because here was a game company opening its facility to filmmakers. Activision gave us a tour of their large-volume motion-capture space and demonstrated the use of their virtual camera system, which I found intriguing.
Activision uses industry-standard infrared camera capture technology with their custom-built wireless virtual camera system. The camera system has a small monitor for viewing a motion-capture performance in a virtual environment, similar to what is typically seen in the ‘making-of-Avatar’ videos.
Flashback
I had seen earlier versions of these types of camera systems before but hadn’t fully grasped the powerful pull of where technology is taking the design process.
In 2008, Chris Edwards, CEO of Third Floor Inc., a company that specializes in previs, was an early adopter of this technology and demonstrated a virtual reality camera made by InterSense at the 5D08 | Future of Immersive Design Conference.
Gamecaster GCS3 Virtual Camera
In 2010, Tino Schaedler, 5D Founding Member, invited a handful of colleagues to his studio for a demo of another virtual camera system that Schaedler thought might be an excellent previs tool for use within the Art Department. He introduced us to developer David Macintosh, CEO of the San Diego-based company Gamecaster, who presented the GCS3, a camera system that allows artists to explore 3D animated environments. At the time I found it difficult to navigate, but felt that with a little practice it would allow a novice such as myself to explore and record shots, saving them as virtual walk/fly-throughs that could be shared with a director.

Several Previs Artists speaking on a Visual Effects Society panel following a screening of ‘Real Steel’ at Disney Studios demonstrated a more advanced version of the virtual camera system. The presentation touched on processes including motion-capture, image-based capture, photogrammetry, virtual camera and simul-cam. The artists raved about the great success of digital previs as a new hub of efficient production planning, while still staying within the parameters of a studio budget and timeframe. In the end the panelists called it ‘modern filmmaking.’
The previs I had been exposed to in the past consisted primarily of crude animatics with models and characters created in-house by the previs team, typically to figure out complicated VFX shots or stunts. It became clear that the newer processes used by the ‘Real Steel’ panel are pushing previs into a virtual world based on real-world parameters, and on a much larger scale than I had seen before. With this new understanding, I revisited Activision and met with Matt Karnes, Director of Motion Capture, and Sylvain Doreau, Cinematics Director, for another look at their ‘technological tour de force’ motion-capture facility, one that specializes in capturing the data used to create previs - the key driver in modern filmmaking.
The Studio
Activision’s new facility is located in the heart of ‘mo-cap alley’, an area near Marina del Rey that is home to several VFX and premier mo-cap facilities, including House of Moves (HOM) and Giant Studios. Activision started doing in-house mo-cap five years ago, in studio space originally rented from HOM. It has since expanded to become one of the top five facilities of its kind in the world.
30' x 60' Mo-Cap large volume performance capture space 
180º Facial performance capture & audio booth
The studio boasts a 30’ x 60’ large-volume performance-capture space (with a 15’-high grid) and 62 Vicon cameras (used with traditional dot-marker suits) that can easily accommodate 6-10 actors. They have a full-body 3D scanner and a separate 3D head scanner (Cyberware), Canon 5D & 4D cameras for high-resolution texture mapping, and a 180º facial performance-capture system that uses 18 Vicon cameras inside a sound booth, allowing simultaneous high-quality audio recording during a capture session.

The Process
Actors in reflector-dot suits perform on the mo-cap stage within a preloaded virtual environment, provided by the game developer, that has been mapped to the physical stage. The actors are able to orient themselves in the virtual world by viewing a large-screen projection at one end of the space. Simplified props and set pieces at real scale are provided for the cast to interact with, for a more natural performance.
Volume space with a ramp for actor performance
Once the actor-performance data is captured, the actors are finished. The performance data is retargeted onto a digital character, which is combined with a virtual environment so that previs animatics can be made. Similar to live-action filmmaking, the Director of Photography and/or Director, with a custom-built virtual camera and the previs as reference, is able to go back onto the mo-cap stage and start filming in real time – but now in a virtual production.
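To make the retargeting step a little more concrete, here is a minimal, purely illustrative Python sketch of the idea: captured joint poses are mapped onto a character rig's joints, frame by frame, so the result can be played back in a virtual environment as a previs animatic. The joint names, data layout, and `retarget_frame` helper are hypothetical simplifications for illustration, not Activision's actual pipeline.

```python
# Illustrative sketch only: map captured joint poses onto a digital
# character's rig, frame by frame, producing an animation track that a
# previs animatic could play back. Names and data layout are hypothetical.

from dataclasses import dataclass


@dataclass
class JointPose:
    rotation: tuple[float, float, float]                 # Euler angles, degrees
    translation: tuple[float, float, float] = (0.0, 0.0, 0.0)


# One captured frame from the mo-cap solve: performer joint name -> pose
CapturedFrame = dict[str, JointPose]

# Mapping from performer skeleton joints to the character rig's joints
JOINT_MAP = {
    "hips": "pelvis",
    "spine_01": "spine",
    "l_shoulder": "arm_upper_L",
    "r_shoulder": "arm_upper_R",
    # ... rest of the skeleton ...
}


def retarget_frame(captured: CapturedFrame) -> dict[str, JointPose]:
    """Copy each captured joint pose onto the corresponding rig joint.

    A real solver would also compensate for differing bone lengths and
    rest poses; here we simply rename, which is enough to show the idea.
    """
    return {
        rig_joint: captured[mocap_joint]
        for mocap_joint, rig_joint in JOINT_MAP.items()
        if mocap_joint in captured
    }


def build_animatic(frames: list[CapturedFrame]) -> list[dict[str, JointPose]]:
    """Retarget every captured frame, producing a playable animation track."""
    return [retarget_frame(f) for f in frames]


if __name__ == "__main__":
    # A single fabricated frame, standing in for data from the capture volume.
    take = [{"hips": JointPose((0.0, 12.0, 0.0)),
             "l_shoulder": JointPose((5.0, 0.0, -3.0))}]
    print(build_animatic(take))
```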
New camera ‘footage’, transmitted wirelessly to a computer, is viewed simultaneously by the DP in the camera ‘viewfinder’ – a 10” high-definition video screen mounted on the camera – and by production and cast on a large viewing screen at one end of the volume space. All the rigs of a traditional camera setup, such as a dolly or Steadicam, may be employed.
Mo-cap client work desks
Sylvain Doreau, Cinematics Director at Activision, explained how virtual camera data allows him to create and edit QuickTime movies on a computer on site as a crosscheck for coverage. Doreau also creates a previs for each shot as a necessary part of his pre-planning, not only for the performance capture but also for the DP, who is shooting the virtual ‘footage’ long after the actors are gone.
According to Matt Karnes, Activision is transitioning to newer technology in which facial and full-body performance capture will be integrated in the volume space. New helmet rigs worn by the actors will consist of 4 to 6 mini HD cameras capturing video at 60 frames per second. The cameras will be mounted directly onto a lightweight helmet, along with a microphone to capture audio. Software tracks dark contrast dots on the actor’s face in the video from each camera; these are then reconstructed as positional point-cloud data, much like in a traditional passive-marker mo-cap system. The positional point-cloud data can be reapplied to any facial rig that accepts this type of data.
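As a rough illustration of that data flow, the toy sketch below turns one frame of reconstructed facial dot positions into values a facial rig could consume, using a simple offset-from-neutral mapping. The marker names, neutral pose, and linear mapping are assumptions made up for illustration only, not the system Karnes described.

```python
# Toy sketch: per-frame facial dot positions, reconstructed as a point
# cloud, are converted into rig control values. The neutral pose, marker
# names, and offset-to-control mapping are hypothetical simplifications.

import numpy as np

# Neutral (rest) positions of the tracked face dots, in head-local space.
NEUTRAL = {
    "brow_L": np.array([-3.0, 6.0, 9.0]),
    "brow_R": np.array([3.0, 6.0, 9.0]),
    "mouth_corner_L": np.array([-2.5, -3.0, 9.5]),
    "mouth_corner_R": np.array([2.5, -3.0, 9.5]),
}


def frame_to_rig_controls(point_cloud: dict[str, np.ndarray]) -> dict[str, float]:
    """Convert one frame of reconstructed dot positions into rig control values.

    Each control is driven by how far its dot has moved from the neutral
    pose along a chosen axis -- a stand-in for a real solver that maps
    point clouds onto whatever facial rig accepts this kind of data.
    """
    controls = {}
    for name, rest in NEUTRAL.items():
        offset = point_cloud[name] - rest
        controls[f"{name}_raise"] = float(offset[1])   # vertical motion
        controls[f"{name}_widen"] = float(offset[0])   # lateral motion
    return controls


if __name__ == "__main__":
    # One fabricated frame: every dot nudged slightly from its rest pose.
    frame = {k: v + np.array([0.3, 0.5, 0.0]) for k, v in NEUTRAL.items()}
    print(frame_to_rig_controls(frame))
```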


Cross-pollination of technology in media design
Game developers such as Activision, applying film technology to the creation of games, are an example of the cross-pollination of technology in the field of media design. At the same time, filmmakers are taking technology from the game world, specifically real-time rendering software, and are developing virtual production.

The Modern Art Department
New technology, real-time rendering power, and better 3D modeling software together enable immersive design visualization within the Art Department. By designing a virtual world for the story to reside in, the Art Department provides a hub for collaboration that all other departments can feed into, giving Previs Artists a world “on the front end” of production within which to operate and facilitate.
Alex McDowell, Production Designer and 5D Co-founder, states:

“In the 'modern' design department, where every designer is using digital tools and working virtually, design visualization is in fact a continual by-product of the design flow. Every design asset created by every designer is potentially an asset for Previs, and the design hub should be considered the front end of a continuous flow of data, through Previs, and to VFX.”

Digital assets normally created by Visual Effects artists on the ‘back end’ are now more and more created on the ‘front end.’ Previs Artists can pull digital design elements from all departments – Art, Costumes, Cinematography and Visual Effects – assemble these elements into a production ‘pipeline’, and use them to pre-visualize shots for early editorial feedback.
Previs acts as a bridge from pre-production to production to post-production. It can prevent the movie from being “made three times.” Previs, in effect, also creates a prototype that can be tested before the film is made: a new planning tool that, for all production departments, can result in dramatic savings of time, effort and cost.

on Saturday, November 12, 2011
Imagine a virtual ocean that is participatory in that you can create a fish, and if your creation is purchased by another participant, you earn money. Strange but true....

On Oct. 11, 2011, Wemo Media and the Los Angeles Chapter of ACM SIGGRAPH held a special behind-the-scenes look at “The Blu”, a new online social media experience created in collaboration with Hollywood and academia.

screen capture of user interface published on Deep Sea News


The invite announced:
     "The Wemo Media team is comprised of Academy award winning artists, technology innovators and execs from film, games and the web, including: Andy Jones (Academy award winner, Avatar), Kevin Mack (Academy award winner, What Dreams May Come), Louie Psihoyos (Academy award winner, The Cove) and Joichi Ito (Director, MIT Media lab). 
    Currently in private beta, “The Blu” is the first experience using the Maker Media Platform from Wemo Media. “The Blu” brings the ocean to life on the web. “The Blu” is a beautiful interactive online geosocial web application where people connect across the Internet and explore a vast ocean on the web. Every species and habitat is a unique work of art created by digital artists and developers around the world. Picture hundreds of thousands of aquatic species and tens of thousands of underwater habitats, beautiful works of art created by artists and developers all over the world, spread across the canvas of the World Wide Web as a globally shared art gallery.
     Wemo Media is passionate about enabling artists and developers to co-create globally shared media experiences and storytelling. At the core of Wemo Studios is the Maker Media Platform, a disruptive online creative collaboration platform enabling digital artists and developers around the world to peer-produce innovative digital media."


I asked Neville Spiteri, creator of the project, whether this space would be available for rent, and he thought I meant the studio. I clarified that I meant 'the virtual space', which is now an asset. Filmmakers could conceivably use this in production... I'm not sure whether that idea had occurred to him before, but it seems like a 'no-brainer', especially since the plan is to create a realistic ocean environment that can be used in part for education and research.

LA Times:  INTO THE BLU

Deep Sea News- Interesting article from a marine biologist:
http://deepseanews.com/2011/11/into-theblu-virtually/
TheBlu website: http://theblu.com
The pictures included in this post can be found on Facebook: https://www.facebook.com/intheblu


on Friday, November 11, 2011
TV VFX is delivering like never before thanks to advances in software. Film technology is trickling down to TV at a much faster rate, allowing production to meet very short deadlines - often inside of a week.
Jurassic Jolts Terra Nova

NewTek Software hosted an excellent panel of stellar VFX artists working in television on Nov 10, 2011 at the Academy of Television Arts & Sciences.

Moderated by David Cohen, Variety tech reporter, with panelists:
Chuck Comisky - VFX Supervisor and Film Producer TERMINATOR 2 3D, ALIENS OF THE DEEP, AVATAR (Stereoscopic VFX Supervisor); On set stereo supervision -SANCTUM 3D, FINAL DESTINATION 5 3D
Doug Drexler – CG Supervisor, VES, Emmy and Academy Award winner for BATTLESTAR GALACTICA and DICK TRACY
Ron Thornton – VFX Producer STAR TREK: THE MOTION PICTURE/Emmy Award winner – BABYLON 5 (VFX Supervisor)
Eric Hance - TERRA NOVA VFX Supervising Artist/Emmy Award winner

David Cohen commented that while film has many VFX-driven features, in television story and content are still king. Much TV VFX is minimal and hard to see, such as clean-up work or location fixes; however, Doug Drexler and Eric Hance are working on two of the heaviest VFX shows on TV at the moment. Both commented on the fast turnaround and the role of VFX in story. They are able to work on the fly, directly in production alongside Art and Editorial (rather than in post), and enjoy the direct collaboration they share with the writers to develop story when needed. These artists relish active team participation in production.
More virtual sets are being used than ever before. The thought is that once production gets used to green screen there will be no going back, since the cost savings are so great. There were no Production Designers present, and I find myself wondering how much input the Production Designer is giving VFX to go on.





on Thursday, November 10, 2011
Tom Walsh, ADG President, invited a handful of colleagues, myself included, to participate in a roundtable discussion with Dr. Gianluca Sergi, Director of the Institute for Screen Industries Research, University of Nottingham, UK, on Oct 10, 2011.

Sergi is working on a publication based on his findings from a survey of the world's most prominent film studios:

 Joining the dots - or around the world in 80 studios
The University of Nottingham in the UK has begun a programme of research projects, industry engagements with leading practitioners and partnerships with major studios to build the kind of research capacity and skill-base that are necessary to help filmmakers and studios alike build a more stable, sustainable and productive environment within which to work.

As our first research project we asked a 'simple' question: what is the role of traditional studio space in the digital age? The question has emerged out of debates around the introduction of digital technologies, coupled with recent large investment in traditional spaces of production (see Warner's investment in Leavesden in the UK, Cinecittà's revamping of its Rome studios, Pinewood opening studios in several countries, and the remarkable success of WETA in New Zealand).

By looking at key studio 'models' around the world, what they have in common and what differentiates them, and how studio production is changing, the project provides a snapshot of what the future may have in store for studio production models, their infrastructures, and how these may impact the dynamics of filmmaking.
According to Sergi, China and Brazil are poised to release a production 'tsunami' that will impact US film production in unforeseen ways unless studios start talking to each other. I asked him whether the studios would ever talk to one another, given that they are competitors. I did not get a satisfactory answer to my question... perhaps because he has an academic point of view.

He suggested we learn to speak Mandarin!  The University of Nottingham has a campus in Shanghai.
One anecdote concerned the number of skyscrapers being built there: Sergi says they keep building, regardless of tenancy (there is a high vacancy rate), to keep people employed. In addition, new cinemas are opening at the rate of three a day.
What a difference a decade makes!
on Wednesday, November 9, 2011
Los Angeles ACM SIGGRAPH hosted a great filmmaking panel last night at the Art Institute of California in Santa Monica.



"The visceral robot boxing matches in REAL STEEL were conceived, planned and executed using an innovative process that gave the filmmakers unprecedented creative freedom and efficiency – by taking virtual production out of the studio and bringing it into real world settings. From robot animation and lighting to set extensions and arena crowds, the teams at Digital Domain and Giant Studios streamlined production from start to finish for maximum efficiency and photorealism."


Digital Domain Panelists:

Erik Gamache - Animation Supervisor 
Geoffrey Baumann -  Digital Environments Supervisor
Justin van der Lek - Digital Environments Lead 
Blake Sweeney - Digital Lighting Lead 

Casey Schatz - Previs Supervisor from Giant Studios

'Real Steel' is among the first film productions, if not THE FIRST, to bring on a digital team in preproduction to capture all the real-world data from the locations and combine it with design data from the Art Department for use in planning the production from concept phase through editorial and into post-production. This is modern filmmaking.

This is the second panel I have attended on the making of this film. The Visual Effects Society (VES) hosted a similar panel following a screening on Oct. 8, 2011 at Disney Studios. That panel inspired me to write an article about previs and virtual production for Perspective magazine, published by the Art Directors Guild.

on Friday, November 4, 2011
California Poppies
This is a test post using a photo taken with my iPhone in Antelope Valley during peak poppy season at the Antelope Valley Poppy Reserve in 2009. The iPhone has an incredible camera.