Check Your iOS iCloud Settings!

Please go into your iOS device’s Settings app and check your iCloud settings. Do it now. I just noticed, to my shock, that all backup and syncing options were turned off on my iPad mini, and that without me purposefully disabling anything. So do yourself a favor and check those settings every now and then. Otherwise you might end up with no backup, or one that is too old, when you need it.

Shadowrun Returns

I got access to my copy of Shadowrun Returns right on release, July 25th, and immediately ended up playing it for about five hours. Probably your first clue that I liked it.

It has been described as Neverwinter Nights meets XCOM by its creators and that description is pretty accurate. I’d add that it at times also has a little point and click adventure feel to it.

On A Budget

Harebrained Schemes got $1.8 million from Kickstarter, which, once you take away all the Kickstarter and Amazon fees plus the physical rewards, ends up as about $1.2 million for the actual game. They later topped up the budget with a loan and additional pre-sales via their website, so I’m assuming the actual budget ended up in the $2 million range again. Which sounds like a lot, but it really isn’t if you do the math.
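For a rough sense of that math, here is a back-of-the-envelope sketch. The fee percentages are the publicly known rates of the time (Kickstarter 5%, Amazon Payments roughly 3–5%), and the physical rewards figure is backed out from the numbers above, not a real number from Harebrained:

```python
# Back-of-the-envelope Kickstarter math. Fee rates are the published
# rates at the time; the rewards cost is my own rough assumption.
pledged = 1_800_000
kickstarter_fee = 0.05      # Kickstarter's cut
payment_fee = 0.04          # Amazon Payments, roughly 3-5%
physical_rewards = 430_000  # assumed cost of shipping/manufacturing rewards

net = pledged * (1 - kickstarter_fee - payment_fee) - physical_rewards
print(round(net))  # lands near the $1.2 million mentioned above
```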

Anyway, the developers did a remarkable job of translating the Shadowrun experience into a playable and infinitely expandable game. The environments are gorgeous, the first story that came with it is pretty solid and supposed to be about 12 hours of gameplay.

You can really only tell that they needed to cut stuff by the lack of detail in the characters. The 3D characters just feel a little unpolished at times. The 2D character art, on the other hand, is top notch.

Another thing that’s sorely missing to give the game that extra bit of atmosphere is a voice-over track. It’s just a little lackluster to read all the dialogue. But given that the editor is meant to let you expand the game ad infinitum, a VO track seems a little unrealistic.

But in the end, Shadowrun Returns is a great, atmospheric game, and anyone even mildly into the Shadowrun universe owes it to themselves to check it out.


Subtitle: Game Review

Prisoners of the Sun Premiere

Today, 29.05.2013, was the official premiere of Prisoners of the Sun, the movie I was VFX producer for and that crashed and burned in 2008 due to a licensing issue. Luxx Studios bought the rights a while back and finished it. It turned out to be a modestly solid B-movie with 3D effects that looked very game-y, which is very sad, because we all—the former production crew—knew what it could have been.

Sadly, not having the rights to the material tends to throw a wrench into finishing up a movie, or any project.

FMX 2013 - Iron Man 3

I planned to write about it, but the presentation was just blowing all our minds and I got sucked into simply watching to not miss anything. What can I say? I’m sorry. You should have been there.


Subtitle: Talk by Guy Williams (Weta), Aaron Gilman (Weta)

FMX 2013 - Crowdfunding, Risks and Chances

Do not add tangible rewards below $60! Shipping and manufacturing are simply too expensive. Otherwise you might not net any money for your actual goal.

Make the first minute of the info video count! On average, people turn off the video after that point.

I’m also going to post a whole array of slides, or rephrase them. At least that’s the plan. But first: sleep and family time.


Subtitle: Talk by Kai Bodensiek (Brehm & v. Moers)

FMX 2013 - Crowdfunding

Phil Tippett had a dream: “Mad God”, a film he wanted to make that, for a variety of reasons, got put on hold around the time of Jurassic Park.

The Problem

Lack of money. Simple as that. To get it, you used to ask friends, family and people foolish enough to invest in your little project. That led to a lot of concessions being made: giving creative control away to investors, putting investors’ kids in the movie, etc.

A Possible Solution - Crowdfunding

Mad God got funding via Kickstarter and Corey talked us through what he has learned.

  • Make your objective clear
    If you cannot define your product, then people have no concept of what they are buying into.
  • Make your pitch personal
    Don’t come off as aloof. If people sense that you don’t need/want their money, they’ll take it elsewhere.
  • Set a realistic budget
    • As in make a spreadsheet and plan things
    • Limit your reward costs
  • Make your project accessible to backers at all levels
  • Let the project reflect you and your idea
  • Don’t be afraid to fail
  • Treat your campaign as a campaign
    • Advertise via social networks like Facebook, Twitter, App.net, LinkedIn
    • Get people that are influencers to push your project to their followers/readers
    • Write frequent updates during the campaign
    • Ride the wave of spikes and plateaus
  • Treat your backers with respect

Subtitle: How To Raise Money For Any Startup, Video Game Or Creative Endeavor By Corey Rosen, Head Of Creative Marketing, Tippett Studio

FMX 2013 - OpenSubdiv

We started the session with a quick history of subdivision surfaces. Invented by Pixar and first used in the short Geri’s Game, they did away with the constraints of both polygonal and Nurbs modeling.

What’s wrong with Nurbs?

Nurbs surfaces are based on control vertices, or hull points, through which a b-spline is calculated. This leads to smooth curvature and inherent UVs. Both are good. However, Nurbs modeling relies on stitching a multitude of Nurbs patches together to form your final surface. The problem arises at those patch seams, where it can be, and often is, mathematically impossible to create a seamless surface, much less when the “patchwork” is animated.

So Polygons then?

Short answer: nope.

Long answer: While polygons have no problem with arbitrarily complex surfaces or seam cracks, they have their own set of problems. First, they don’t have inherent UVs, and unwrapping a complex mesh for texturing is no small feat. Second, to get smooth surfaces you need a very high polygon count, or cheap tricks with normals, which tend to fall apart quickly under scrutiny.

Subdivs

First, the world in general and Pixar in particular just calls them Subdivs, not subdivision surfaces.

Also, they are the answer to all the problems above: arbitrarily complex, while always maintaining a definably smooth, crack-free surface. And with the addition of Ptex there is no need for UV unwrapping anymore.

In addition, Subdivs support localized levels of subdivision. That means the whole pipeline can work with the coarse base mesh. The modeler can then go into specific sections that need more definition, locally subdivide that area and make modeling changes there. Those get saved and applied at render time. Bill showed an example of that at work: in Brave, which was originally supposed to be set in winter, they had Merida’s horse run through a snowy plain. The plain itself had a resolution of about one vertex per square meter, enough to model snow drifts. For the horse’s path, however, they locally increased the plain’s resolution to one vertex per square centimeter at render time to capture the fine detail of hooves disturbing the snowy surface.

At rendertime

The term “at rendertime” is misleading, because Pixar is now using a GPU implementation of the subdiv algorithm. The implications of that are far reaching.

At the simplest level, “at rendertime” in the paragraph above means the animator gets a live preview of those several hundred thousand faces in real time in the viewport (Maya’s Viewport 2.0 in this case, which already has OpenSubdiv support built in). Let me restate that: we saw a demo of a “low poly” mesh of about 3000 faces, animated with bones, with the OpenSubdiv algorithm applied. What we saw on screen were about 3.8 million faces animated in real time. And since Subdivs get displacement at hardly any additional cost, those nearly 4 million polygons were displaced as well. Very intriguing stuff.
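For scale: Catmull-Clark subdivision roughly quadruples the face count per level, so a handful of levels turns a light cage into millions of faces. A quick back-of-the-envelope check of the demo’s numbers (my own arithmetic, not from the talk):

```python
# Catmull-Clark roughly quadruples the (quad) face count per level.
def subdivided_faces(base_faces: int, levels: int) -> int:
    """Approximate face count after n uniform subdivision levels."""
    return base_faces * 4 ** levels

for level in range(1, 6):
    print(level, subdivided_faces(3000, level))
```

Five levels on a 3000-face cage gives about 3 million faces, the same ballpark as the demo.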

This is not only interesting for VFX, though. A real-time GPU implementation also means that games adopting this will get much more visually complex. And in fact, while Bill could not mention any names, he went out of his way to let us know that a major mobile company, one that produces very popular devices we all own and that has the power to dictate to chip manufacturers what to put into their chips, will implement hardware OpenSubdiv support within the year. Or as Bill put it: you will likely have devices with hardware support built in the next time you see me.

Good for Pixar. How do I get it?

That’s the nice part: you likely already have the technology available to you. For one, OpenSubdiv is open source and all the licensing for the technology is free as well. Also, if you use Maya, you have access to all this already. Maya has used the exact same algorithm as Pixar since Maya 5.0 and now also has the GPU implementation through Viewport 2.0.

So no excuses, get cranking!


Subtitle: Talk by Bill Polson, Director of Industry Strategy, Pixar

FMX 2013 - Camera Physics

This talk was very theory heavy, with lots of formulas and photos of curves that don’t summarize well. I still gave it a try here.

The session started with the history of camera tech, beginning with the first wooden boxes with a hand crank.

A hand-cranked camera had severe restrictions: it obviously could only film where a human operator could go, and there wasn’t even adjustable focus. Since then we have come a long way. Huge VistaVision camera rigs, car cranes with two seats at the top end of the crane (the Titan crane) and the Technocrane, the first camera crane that didn’t need someone to look through the lens, allowing much greater camera freedom. This was thanks to Jerry Lewis’s idea of a video feed.

After that came motion control rigs, which not only allow repeatable motion to shoot many matching passes of a shot, but also allow combining live footage with camera-matched 3D footage or miniature footage.

Setting The Scene

  • it’s imperative to set up your film back accurately
  • after that setting the focal length should give you the correct field of view
  • you need to set up your nodal point on set correctly
  • live action usually is not nodal, meaning there is a parallax while panning
  • this must be matched in the 3D camera

Real Life Camera Motion In 3D

Cameras must adhere to physics, which means there is a limit to the acceleration of an object. Not speed. Acceleration. Meaning sharp changes in speed, up or down, or in direction.

The acceleration is the second-order derivative of the position (translation) curve. Ideally, you want your 3D curves to accelerate gradually with no more than 9.82 m/s², or 1 g. Some motion control systems can handle more, some less.
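That check is easy to sketch: sample the translation curve per frame, take second-order finite differences, and flag anything above the limit. A rough illustration, not anyone’s production tool; the frame rate and threshold here are my own assumptions:

```python
# Sketch: per-frame acceleration of a sampled camera translation channel
# via second-order finite differences, flagging frames over the ~1 g limit.
FPS = 24    # assumed samples per second
G = 9.82    # acceleration limit in m/s^2, as quoted in the talk

def accelerations(positions):
    """Second derivative of position, approximated frame by frame."""
    dt = 1.0 / FPS
    return [
        (positions[i - 1] - 2 * positions[i] + positions[i + 1]) / dt**2
        for i in range(1, len(positions) - 1)
    ]

def over_limit(positions, limit=G):
    """Frame indices whose acceleration exceeds the limit."""
    return [i + 1 for i, a in enumerate(accelerations(positions)) if abs(a) > limit]
```

A constant-velocity move passes; any step change in position spikes far past 1 g and gets flagged.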

Of course, there is also a certain speed limit involved.

Problems Capturing Live Motion For Repeatability

A problem arises when you have a camera motion, usually a Steadicam shot, that you need to transform into a motion control shot. For example, you film your actors with a Steadicam in a greenscreen setup and then need to repeat that move with a motion control rig on a different set or a miniature.

The way to go about it is to matchmove (track) the shot, which will give you a rather noisy result, at least in motion control terms, which you won’t be able to program. You can smooth the result, which loses you accuracy but enables you to program the shot. The trick is to filter enough without causing misalignment.
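The filtering step can be as simple as a centered moving average over each tracked channel; wider windows give smoother, programmable curves but drift further from the original track. A minimal sketch of the idea, my own illustration rather than the speaker’s tool:

```python
# Centered moving average over one noisy tracked camera channel.
# The window width is the "filter enough, but not too much" knob.
def smooth(samples, window=5):
    """Centered moving average; window should be odd."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo = max(0, i - half)
        hi = min(len(samples), i + half + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out
```

In practice you would compare the smoothed curve against the raw track and widen the window only until the rig can execute the move.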


Subtitle: Talk by Anthony Jacques, VFX Camera Operator

FMX 2013 - Panel Discussion on the Future of the VFX Industry

Not too much to record for this one. It was mostly company heads trying to weasel their way out of loaded questions from the audience and discussion chair Eric Roth. Things like: “Do you think a VFX union is a solution to the problems?” “…”

It was interesting to see the panel stammer around the issue. But mostly it was sad that they still don’t see that they are part of the issue: companies not standing up and showing some balls.

Example: Pixomondo’s Christian Vogt agreeing that tax incentives are part of the issue, while not mentioning that Pixomondo is pushing for tax credits here in the Stuttgart area. In fact, the papers are about to be signed within a month.

One of the last questions was good: “If you had 30 million, would you invest it in VFX today?” (This after every panelist confirmed they feel positive about the future of the industry.) Not one would use the money to open a VFX company. Very telling. :) Some suggested investing it in a film fund.

Also: “Why do you keep working on a fixed-cost model? That’s insane.” A (summarized): “We are no businessmen and also passionate. So it’s mostly our fault, but it’s likely not going to change, since our clients won’t go for a cost-plus model.”

––––

Subtitle: Panel with Eric Roth (VES), Pierre Buffin (BUF Compagnie), Mark Driscoll (LOOK Effects), Christian Vogt (PIXOMONDO), Jean-Noël Portugal (jnko)

FMX 2013 - The "unfilmable" Life of Pi

Opening with a joke about the botched Oscar ceremony, this promised to be a good, yet sad, talk. Rhythm & Hues recently went from 1000 artists to a small fraction of that.

Why was Life of Pi “unfilmable”

Three simple reasons, combined making for a perfect storm:

  • water
  • animals
  • kids (the actor playing Pi, while technically not a kid, was playing his first role and could not swim)

Research

Ang Lee did some hands-on research into how a lifeboat or raft behaves in the ocean and how water and waves behave. Authenticity was paramount.

A pool with a wave machine was built, in addition to an extensive set of 3D shaders mimicking the pool water. The real water had to blend seamlessly with the CG.

Water Shading

  • custom, physically plausible water shader
  • a set of five different water noises to layer and mix the water as needed
    • to avoid tiling a set of noise patterns was used to multiply the effects of the water noise parameters
  • once water was dialed in, it was locked and 3D artists couldn’t change it and needed to adjust the scene to get the desired effect
  • and of course, physical accuracy went out the window as soon as the comp stage was reached, or when the director wanted the water to look different than the physical simulation said it should

Skies

About 140 different extremely high-res HDRI skies were shot, and artists could pick and choose crop-outs.

Meerkat Island

  • one giant single Banyan tree system with about 6 billion polys per frame
  • up to 45000 meerkats in one shot done with Massive
  • up to 10000 in the shot
  • basically only Pi is real, the island is CG

Richard Parker, the tiger

  • subsurface on the fur
  • new muscle system
  • not anthropomorphized at all
  • based on a real tiger, “King”, from France, he appears in only 23 shots

muscles simulation → subcutaneous skin layer → epidermis layer that slides over the top → covered by fur

All that leads to the realistic wiggling, bouncing and sliding of skin that makes for a realistic animal.

10 million strands of hair, lit with area lights and subsurface scattering


Subtitle: Talk by Chris Kenny, Compositing Supervisor, Rhythm & Hues

FMX 2013 - Le Big Shift in VFX

The discussion centered around what the industry can do to improve interoperability and workflow to strengthen the business instead of running it into the ground.

Open Data Platforms

Rob Bredow took the lead by talking about the work he and SPI have done to create good open standards onto which companies can build to achieve something greater: Alembic, OpenColorIO and OpenEXR, to name a few.

Before, every company needed to reinvent the wheel in-house to set itself apart. Now they can work on a common standard, which helps with interoperability between companies, something today’s industry requires.

We have moved from “secret sauce” to a common baseline, which also includes the game industry. A convergence of VFX and games in this respect seems inevitable to Cevat Yerli from Crytek.

The words “vector of cross-pollination for these industries” were uttered. That should tell you pretty much everything you need to know.[^1]

Cloud Based Solutions

Cloud-based computing power is a topic everybody was interested in, even the big houses, which usually have several thousand render farm machines of their own. There are immense draws to this kind of workflow, from lower overhead, due to savings on machine, administration and licensing costs, to being able to ramp up render power quickly during a deadline crunch.

This interest is shared equally between small studios like ours and the big boys, and it is only expected to grow in the coming years. Ludwig von Reiche talked a bit about cloud computing applications in development, running on Amazon’s Elastic Compute Cloud among others, which he expects to come out within the next year.

While there are already solutions that offer Amazon cloud rendering, they are usually cobbled together, requiring a small science degree to figure out. We are talking about more accessible solutions.

Rob Bredow argued that there are really two sides to cloud computing. One is the reduced cost due to less inventory and lower running costs. The other is adding a lot of render power on demand, say 1000 or more machines, to render a shot in an afternoon. That might cost more, but it speeds up the creative cycle and might be worth it from that perspective.

[^1]: That’s around where I got bored and lost track of the conversation a bit.

Subtitle: Panel Discussion with Marc Petit (Autodesk), Rob Bredow (SPI), David Morin (Autodesk), Don Parker (Shotgun), Ludwig von Reiche (NVIDIA ARC), Cevat Yerli (Crytek)

FMX 2013 - Cloud Atlas

Starting out with a short overview of RiseFX, his company, Florian dove right into the workflow for Cloud Atlas. RiseFX has garnered a reputation for innovative approaches to set extensions, and they didn’t disappoint on Cloud Atlas either.

Starting out with some stats:

  • most expensive German movie to date at 100 million US dollars
  • financed independently, internationally
  • lots of famous actors in multiple wildly different roles
  • shot only in European locations standing in for totally different environments
  • pre-production started July 2011 with the previz of 1973 San Francisco and the Luisa Rey car crash
  • shooting started in August

Car Crash

The previz was planned very accurately and appears pretty much verbatim in the movie.

Filmed on a bridge in Scotland, which led to major cleanup work, as the bridge was supposed to lead to an island, not to mention be in San Francisco. A major challenge was that the scene plays at night, with various camera crane moves showing kilometers of street environment. While you could in theory light that set, it would involve huge amounts of lights, power and therefore budget.

So the guys went ahead and shot it on the backlot of Rise in Berlin in front of a blue screen. For the bridge, they made a 3D scan of the environment, which got them a textured and completely relightable model. Easy.

The crash itself was filmed on a gimbal with added CG trash elements floating around giving a zero gravity effect.

The car crashing into Halle Berry’s car was a 3D car, for a simple reason: the director liked the movement of the car in a specific take, which was unfortunately unusable due to plate errors. So the car was modeled and textured based on all the different takes. Then the preferred take’s car was matchmoved and the 3D car placed into the shot.

Plane Crash

The plane crash was a pretty straightforward Houdini simulation. However, for this project RiseFX adopted a 100% Houdini approach, unlike most other companies, which simulate in Houdini but model, animate and render in another package. This approach saved them a lot of headaches, as everything could interact and be rendered in the same package.

For the plane explosion, that meant not only could the simulation influence the fluid simulation, but they could also light the geometry with it. Meaning the explosion lit everything physically correctly, including all the small debris.

Lots of environment re–projection

Having all the sets as lidar 3D scans meant that they could very easily redress entire sets by just replacing them with the 3D model. It also made relighting a lot easier.

––––

Subtitle: Talk by Florian Gellinger

FMX/fxphd Kickoff Meetup

Yesterday evening was the first of likely many fxphd meetups this week. About 15 people showed up and had a relaxed chat. It was a great time to catch up with old friends and meet new faces.

Movies were discussed, Schnitzels were eaten, and, like the cruel friends we are, we teased John Montgomery (@johnmontfx), who sadly couldn’t make it this year, with pictures of delicious wheat beer.

All in all, it was great seeing everyone again and having some light non-FMX, but of course still VFX, chat.

Remote posting to Kirby via iOS now working

Next week is FMX, a great yearly convention about visual effects, games and virtual reality. I’m there every year reporting for Professional Production Magazine, and I thought this year I could up my game and write short blurbs on this blog about the sessions I attend. I asked around, and there was some interest from people who could not make it this year or who are interested in the topic.

So far so good. There was only one problem. I switched to using Kirby as the back end of all my websites, including this blog. Kirby is a great lightweight system for web publishing. Only, because it is still so young, it doesn’t have a mobile client that lets you create posts or pages easily. True, it comes with an admin panel, but that doesn’t work offline, and it also doesn’t work on my iPhone. So I was looking for something that better suited my needs. And I had a first version working.

Drafts, Dropbox, Hazel, Rsync. Easy, really.

Since I only had a few hours here and there, it is still really rough and not as elegant as I’d like, but it works. I can now use Drafts to compose my posts on either my iPad or my iPhone, or start on one and finish on the other, since notes are always kept in sync. I send the post to Dropbox via Drafts’ custom Dropbox actions, where it is picked up by Hazel, renamed to fit my Kirby naming convention, put into the right folder and then uploaded via rsync.
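The rename step Hazel performs can be sketched like this. The date-prefixed slug scheme below is a hypothetical stand-in, since my actual Kirby convention isn’t spelled out here; the point is just the shape of the transformation:

```python
# Sketch of a Hazel-style rename: turn a draft title into a
# date-prefixed, URL-safe file name. The naming scheme is a made-up
# example, not my real Kirby convention.
import datetime
import re

def kirby_filename(title: str, date: datetime.date) -> str:
    """Lowercase the title, replace non-alphanumerics with hyphens, prefix the date."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    return f"{date.isoformat()}-{slug}.txt"

print(kirby_filename("Remote posting to Kirby via iOS now working",
                     datetime.date(2013, 4, 23)))
```

Hazel then drops the renamed file into the right content folder and hands it off to rsync.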

Pictures work, too

So much for plain text posts. But pictures work, too, though they are still very much a pain. iOS’ default camera app records the orientation of the device as EXIF data, so portrait-orientation pictures are not actually rotated; instead, the EXIF orientation tag is set to 90° or -90°. Which is all nice and fine, except Safari, and a few other browsers, disregard the EXIF orientation, so images appear in landscape on my site after I upload them. That sucks. The only way around it is either to do anything to the image in Camera+ or another photo editor and save the result, which bakes the rotation into the image, or to write a Hazel rule that bakes the orientation into the image after it lands in Dropbox. I’m probably going to go with the latter, since I’m trying to save battery life and using Camera+ a lot drains the battery quite a bit.

But I’m getting ahead of myself. To get my images up to Dropbox I use a great little app called CameraSync, which has Geofencing support, meaning I don’t have to do anything to upload the picture to a folder of my choosing on Dropbox. Also, and this is the killer feature, it resizes them before the upload, saving me time and bandwidth.

The biggest pain from here is that I still need to manually move the images into my blog post’s folder and rename them from somename–timestamp.jpg to the naming I use for images on my blog. I have not found a smarter way to do it yet than doing it manually. As I said, the whole process is still rough around the edges.

Please update your RSS reader

Effective immediately, I switched this site’s RSS over to URI.LV to handle my feed subscriptions. With Feedburner going away in the near future and Kirby not having its own feed tracking, URI.LV seems like a pretty good replacement.

Also, its support rocks. I had a problem with the URL rewrite rules, and Maxime, the mind behind URI.LV, helped me out within minutes.

Cartoon Movie 2009 — Lyon, France

At the moment, I am in Lyon, France, to write about Cartoon Movie 2009. At this conference, animated motion pictures and games are presented to investors and distributors. A wide range of animation styles and storylines is present, which makes the conference really interesting to watch. A lot of great, creative people come together to give each other feedback and to look for new ideas and opportunities. All in all, it is an exciting melting pot of like-minded people.

I will be writing a two-page article about it for Professional Production magazine in Germany and probably will present an abridged version of it here when I am done with it. So keep watching this space for more info.

VFX Coordination for Pixomondo

I have been hired at Pixomondo Images in Ludwigsburg to work as VFX Coordinator on some of their recent projects. This is an exciting opportunity. The company is considerably bigger than most companies I have worked for so far, and I will handle several smaller projects instead of one or two big ones. That is a new challenge, and I am looking forward to mastering it in the coming weeks.

Being part of Pixomondo makes me proud, as it is a very professional, yet cool, company to work for. I have a chance to stay with them longer if I don’t mess up during the first one and a half trial months. Wish me luck.

Prisoners of the Sun put on hold

It is official now: the production of “Prisoners of the Sun” has been put on hold. All employees and freelancers have been laid off as of April 30th, 2008.

According to rumors, it seems the legal situation of the movie was not researched thoroughly before going into post production.

It is a pity, because we were at a stage where all the pipelines were set up and we had just started our first two full-CG shots. We were actually starting to have some good old VFX fun. But not anymore. Now 20 people are hunting for jobs again.

It really makes me sad to give up a great little company. We had a killer team. Thanks to all of you for being so great.

P.S.: If anyone has an open position for a compositor or VFX producer let me know.

Securing Your Laptop at Work

I have wanted to share this little trick with you for quite a while now, but of course I have been busy and other things got in the way. They always do.

I work with my MacBook Pro at home and at the office. That is of course nice, because it is my only base of operations and I have everything in one place.

But do you know the feeling that you are working with your laptop at work and don’t want anyone to snoop around on it, because it also is your private laptop and has all kinds of personal stuff on it? I know I do.

Luckily, there is an easy fix for that: enable the setting that pops up a password request after your machine wakes from sleep or dismisses the screen saver. But now you always have to enter the password at home, too. Kind of annoying.

There is an easy fix for that as well. The key is location-aware software like MarcoPolo. MarcoPolo can trigger certain actions depending on where it knows you are at the moment. Pretty cool stuff. That means it can enable or disable the screen saver password depending on whether you are at home or at work. Neat, isn’t it?

I made a little screencast walking you through the steps. All you need is MarcoPolo, which you can get for free from the developer’s website.

If you don’t want to use MarcoPolo, you can also use AppleScript to do the same thing with whatever software you prefer. In fact, I used the scripts below with MarcoPolo, because I overlooked the very convenient built-in action that already does this.

The AppleScripts are as follows:

#### Enable Screen Saver Password

tell application "System Events"
	tell security preferences
		-- require a password when waking from sleep or screen saver
		set properties to {require password to wake:true}
	end tell
end tell

#### Disable Screen Saver Password

tell application "System Events"
	tell security preferences
		-- stop requiring a password when waking from sleep or screen saver
		set properties to {require password to wake:false}
	end tell
end tell

Have a look at the screencast for more in-depth info.

I hope you enjoyed this little hint. Check back for more in the future.

Blog Posts Transferred

I finally found the time to transfer all the posts from the old BabylonDreams blog over to their new home at this address. I only transferred posts I attach a certain value to, so not everything is mirrored 100%.

If you miss a post or something is broken, please send me a message via the contact form. Thank you and enjoy diving into all the new/old content.