Born in Amsterdam, I have been mesmerized by immersive cinema ever since I was a kid. Wouldn’t it be a dream if I could work in that field? Incredibly, here we are.
With a real passion for mentorship and a strong production-focused mindset, I bring a decade of incredibly varied experience: from starting out in tech support, to doing FX in film, episodic and advertising, to now being an FX Lead in animation with many supervisory duties.
As FX Lead of a team of seven, I was responsible for managing and supporting the artists on the team, mentoring the juniors and troubleshooting their FX where needed. I developed some of the larger set-ups that were used across the main sequences.
Another key responsibility was the artistic translation between the director’s vision for the film and the actual effects.
As we did not have a dedicated FX Supervisor, I fulfilled a number of those responsibilities as well: client calls, liaising with production to keep schedules aligned and on track, flagging potential roadblocks early, and bidding for all FX across the show.
On Star Trek: Beyond I had the luck to be put on the Yorktown team, where I had the chance to develop a full-fledged city generator system to populate the Yorktown space station with cities.
The system I developed would take a city footprint consisting of numerous city blocks as input, and would simulate the buildings into place. Because the buildings would collide on contact, this prevented them from intersecting with each other. I would then attract the buildings to the edge of a city block, forcing them to align to that edge so they faced the road.
Using Python I stored each city block and its location relative to the Yorktown in a file, so that once all city blocks had been simulated, I could pull them back into their correct locations at the click of a button, forming the entire city. At this stage only the bottom sections of the buildings were present (there were mid and top sections as well).
A post-process I wrote mainly in Python would then look at the attributes on these base sections to determine their type and height, and stack as many mid sections on top as the skyline height allowed. This height would differ across the city, bound by a minimum and maximum value. Once the maximum was reached, Python would make the final section a top section.
To define these minimum and maximum values I developed a painting system, using red values between 0 and 1, which would then be multiplied by the maximum height attribute. This system effectively allowed me, or any artist, to paint the skyline.
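The stacking and painting logic described above can be sketched roughly as follows. This is a minimal, hypothetical illustration with made-up names and values, not the production code: the painted red value scales a maximum skyline height, and mid sections are stacked until that height is reached, after which the building is capped.

```python
# Minimal sketch of the section-stacking pass (hypothetical names and values).
# Each building starts from a base section; mid sections are stacked until
# the painted skyline height is reached, then a top section caps it off.

def stack_sections(base_height, mid_height, painted_red, max_skyline):
    """Return the list of section types for one building."""
    skyline = painted_red * max_skyline   # red paint value in [0, 1]
    sections = ["base"]
    current = base_height
    while current + mid_height <= skyline:
        sections.append("mid")
        current += mid_height
    sections.append("top")                # final section caps the building
    return sections

# A building painted at half the maximum skyline height:
print(stack_sections(base_height=20.0, mid_height=10.0,
                     painted_red=0.5, max_skyline=120.0))
# → ['base', 'mid', 'mid', 'mid', 'mid', 'top']
```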
I would then write out the city as a point cloud, with all the attributes needed to recreate the city in Clarisse, our lighting tool, stored on each point. The actual building sections would then be instanced onto the point cloud, which could then be rendered to reveal the final image.
Thanks to this point cloud instancing approach we were able to display the cities and the Yorktown itself, consisting of 1.3 trillion polygons, in real time.
Due to the vast nature of the task at hand – a space station 16 miles in diameter containing 64 of these cities – I had to automate and proceduralize the system as much as possible.
Various Python scripts throughout the set-up allowed it to run automatically. Generating a city would now take less than an hour, and as such we were able to quickly turn around changes in client briefs, lay-out, art direction and so forth.
Cinefex 148 makes mention of DNeg’s “new generation proprietary urban layout software” on page 77, of which my city generator system is one of three key parts.
I was lucky to be part of an incredibly talented team who taught me more in that time than I could have imagined. For this, I am immensely grateful.
Other FX work on Star Trek: Beyond includes volumetric lighting and rendering, creating moving debris clouds, smoke simulation/advection for nebulae and fog, and rigid body dynamics for colliding asteroid fields.
A key contribution to the project was a reusable boat wake tool, which created trails behind the ships on the ocean surfaces without altering the oceans themselves or requiring any resimulation. Using Python and VEX I built a set-up that would procedurally create a trail behind a moving ship, with user control over, among other things, fall-off, wave frequency, shape and trail length.
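To illustrate how controls like these might combine, here is a toy sketch of a procedural wake profile. All names and parameter values are hypothetical, and the real tool was considerably more involved; this only shows the idea of a displacement that fades with distance behind the ship, oscillates at a set wave frequency, and narrows laterally into a wake shape.

```python
import math

# Toy sketch of a procedural wake displacement (illustrative, not the studio
# tool). The displacement fades with distance behind the ship, oscillates at
# a user-set wave frequency, and narrows laterally into a V-shaped trail.

def wake_displacement(dist_behind, lateral, trail_length,
                      falloff=2.0, frequency=4.0, amplitude=0.3):
    """Surface displacement at a point relative to the ship's stern."""
    if dist_behind < 0 or dist_behind > trail_length:
        return 0.0                                        # outside the trail
    fade = (1.0 - dist_behind / trail_length) ** falloff  # fall-off control
    wave = math.sin(dist_behind * frequency)              # wave frequency
    width = math.exp(-lateral * lateral)                  # lateral narrowing
    return amplitude * fade * wave * width
```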
Using colour mapping I then mixed the trail surface into the existing ocean surface to seamlessly blend the two together, the trail surface being flatter and more turbulent than the ocean around it.
To create the correct motion on the foam and white water in the trail, I used FLIP simulations to create a bespoke velocity field for the particles to follow and mixed that into the surface of the trail. This I then added to the surface and velocities of the ocean surface before simulating the white water, creating visible churning in the boat wakes.
For lighting to be able to render the changes my team and I made to the ocean surfaces, I also developed a masking tool that stored the displacements of my trail set-up in attributes, so lighting could pick them up.
Other work includes extensive white water simulations for the ships in the ocean and for the water walls in the sequence where the ocean splits apart. I also did FLIP simulations on some of the ocean and water wall shots, mostly splashes, and in a few cases simulated the front end of the boat to create extra wave motion on the ocean surface where required.
Lastly, I helped the team get the ghost effects through. These are the floating effects you can see around Salazar and his crew as they move about. In the fight sequences these effects proved tricky, but they were fun to do.
On this project I ended up doing quite a variety of work. Most notably I developed a system that would generate roads with intersections along with street lights and traffic lights along said intersections, for the CG Shanghai shots. I also created a tool that would procedurally scatter lit windows around the CG buildings for the same sequence.
The traffic lights on the intersections were scripted to change phase, cycling between green and red, and I also wrote a script that let CG cars drive automatically along the roads, obeying the traffic lights and avoiding collisions with oncoming cars and, of course, with cars in front of them. They could also change lanes when turning, with animated turn signals.
The whole system was scripted in Python and allowed us to do a multitude of large cityscape shots without having to do all the work manually, which helped us enormously in keeping the scope manageable.
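The behaviour described above can be sketched as a simple car-following update. This is a one-dimensional toy version with hypothetical names, not the production script: each car brakes for a red light it is approaching and matches the speed of a car ahead when the gap closes.

```python
# Toy sketch of the car-following logic (hypothetical, 1D along one lane).
# Each car brakes for a red light just ahead and keeps a safe gap to the car
# in front, echoing the behaviour scripted for the Shanghai intersections.

SAFE_GAP = 8.0  # illustrative stopping/following distance

def step_cars(cars, light_pos, light_green, dt=1.0 / 24.0):
    """Advance one frame. cars: dicts with 'pos'/'speed', sorted front to back."""
    for i, car in enumerate(cars):
        target = car["speed"]
        # Brake for a red light we are close to but have not yet passed.
        if not light_green and car["pos"] < light_pos < car["pos"] + SAFE_GAP:
            target = 0.0
        # Match the car in front (cars[i - 1]) when the gap gets too small.
        if i > 0 and cars[i - 1]["pos"] - car["pos"] < SAFE_GAP:
            target = min(target, cars[i - 1]["speed"])
        car["pos"] += target * dt
    return cars

# A lone car stops at a red light five units ahead:
cars = [{"pos": 0.0, "speed": 12.0}]
step_cars(cars, light_pos=5.0, light_green=False)
print(cars[0]["pos"])  # → 0.0
```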
For the Paris shots I created a tool that could animate the CG metro trains and keep them on the tracks without hand animation, controlling for speed and some banking in turns. The underground metro shots were done with that system.
Of course there were more conventional effects in the mix as well, such as gunfire in the Paris metro shots and, most notably, a large explosion in the Shanghai tower.
For this project I was tasked with creating falling tinsel from the bomber fleets in the Dresden sequence. Tricky little things that needed to be art-directable in their movement, and that also had to come out of the airplanes in blocks before coming apart. For the many shots where the family stand gazing at Dresden, this was a picky process of deciding exactly where the tinsel would fall and how much, so as not to upstage the main characters.
The bomber fleets themselves were also animated by a tool I created, with a bit of pitch and yaw on the planes while keeping them in a V formation on their way to the target. The planes would also drop rows of bombs, which on their way down needed to pitch forward, turning vertical before impact. In addition I created a procedural system that would trigger a flash each time a bomb hit, avoiding the need to do all of that manually. All of these behaviours in the bombing sequence were scripted in VEX.
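The bomb pitching behaviour boils down to orienting each bomb along its velocity. The production version was VEX; here is a hedged Python sketch of the same idea, with hypothetical names: the pitch follows the velocity vector, so a bomb tips from horizontal at release toward vertical just before impact.

```python
import math

# Sketch of the bomb pitch behaviour (scripted in VEX in production; shown
# here in Python for illustration). The pitch follows the velocity vector,
# so a bomb tips from horizontal at release toward vertical before impact.

def pitch_from_velocity(vx, vy):
    """Pitch in degrees: 0 = horizontal flight, -90 = straight down."""
    return math.degrees(math.atan2(vy, abs(vx)))

print(round(pitch_from_velocity(10.0, 0.0)))   # just released → 0
print(round(pitch_from_velocity(0.0, -30.0)))  # near impact → -90
```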
For Birds of Paradise we had a unique challenge to solve. In the ballet sequence where the two girls would dance around each other in a psychedelic episode in a nightclub, we had to create the illusion of endless amounts of glitter being emitted from them. We had no CG doubles or any real tracking for this sequence.
I ended up opting for a KineFX solution, taking the standard dummy model that ships with Houdini and quick-rigging it with KineFX. We had a recreation of the dance floor, so I had some sense of where the girls were in space and was able to 3D-match them with the KineFX-rigged dummy. KineFX is incredibly flexible and quick to use, so I could do about six shots in a day, while help from the Maya team yielded another two shots per day.
These CG doubles allowed me to calculate accurate velocities and gave me geometry to emit the glitter from. This gave us much better and more dynamic results than a previously considered 2D outline approach, which lacked any sense of directionality and could not produce the swirls and sweeps derived from the dance moves in the way KineFX could.
On Brothers Grimsby I was tasked with rocket trail effects. The brothers sat themselves on top of fireworks rockets, which were smouldering and smoking before being launched. At and right after launch, particles were advected inside the smoke simulations and rendered as embers and sparks.
Amundsen presented a challenge again as we didn’t have the best tracking information. Aside from crumbling snow in a few smaller shots, I was tasked with an effect for breaking a large snow cover over a crevice into which one of the men would fall, along with his sled.
This was difficult to get right visually as there was a mismatch between the shot plate and the CG camera and geometries we had to work with. Nonetheless, I believe the effect itself was really quite successful, with an RBD sim slowly breaking outward, covered in a layer of clumped grains to simulate the snow on top.
On Nautilus I was mainly responsible for large scale FLIP simulations, the most prominent of all being the big breaching shot depicted here.
For the Kalpani Escape sequence, where the Nautilus is stolen from the port by Nemo and his crew, I developed a set of techniques that allowed the Nautilus to be simulated in place with the FLIP sim going around it, as opposed to simulating a moving Nautilus in a moving tank. The upside of this technique was that it provided a lot of control on the details of the simulations as well as being more efficient and faster to run. By controlling the velocities and sculpting the velocity field used in the sim we had fairly fine control over things like wave height and the amount of churning and splashing.
The downside of the technique was that the white water tools effectively operated in reverse, generating white water where you would not expect it and vice versa. For example, the highest velocities were now in between the waves, while the waves themselves were now lower-velocity areas. Vorticity was of course still similar, but it required hacking and customising the white water tools a fair bit to make them do what we needed.
This show was all about magic effects. The magic effects I created include the Dowling death scene, where she turns into a glittering cloud of particles, as well as the magic prison effect that is used to imprison Bloom.
The Dowling death scene was really quite tricky because we didn’t have brilliantly tracked CG doubles, and for the aerial shots the cameras were a bit of guesswork, which complicated matters.
For the magic prison effect the aim was to mimic the effects from the previous season while updating it a bit, which is where the particle effects on the leading rim come in.
My favourite effect is Stella’s glowing orb. This was a fun one because it required a good understanding of compositing to make the effect work, as the FX renders were mainly technical passes for manipulation in Nuke and not “ready to go” beauty renders. I love how the effect is small, beautiful and subtle instead of big and brash. I developed this effect together with Zakaria Bouamama.
For Willow I did a custom growth set-up for the growing magic bush, at the end of episode two. This was hands down one of the trickiest shots I have ever done as there is nothing out of the box here, and the client was very specific about what they wanted for this shot.
Writing a custom solver, heavy on VEX scripts, I was able to make the bush grow and branch out where directed. The speed was controllable, and so was the thickness of the stem and branches. The effect also included animated and unfurling leaves, as well as various berry clusters with changing colours to reflect ripening over time.
The soil around the bush is moved by outgrowing roots that stay close to the surface. These break apart a Vellum Grains simulation, creating sandy cracks and rolling motion in the soil.
As the bush grows and the berries mature, the branches bend down slightly under their increased weight.
Blood. Everywhere. On Cursed I was tasked with blood effects for the fight scene where Nimue attacks a group of crooked monks. This was a challenging sequence, as some of the shots were slow motion while others were very long, each requiring a slightly different approach.
I used FLIP simulations for the blood effects, relying mainly on viscosity and surface tension to keep the blood patches together. I used a custom meshing approach to avoid popping in the meshes, and the shading, lighting and rendering are also by my hand.
Other effects work includes things like ropes and chains on various sets and on the castle shots, as well as a whole host of waving flags and banners across the episodes worked on.
On Alex Rider I developed the effect where a Mayan archer disintegrates into variously sized voxels (cubes) when struck by an arrow. I did this by scattering points inside the target object, divided into equal square parts, and instancing cubes on top, which would inherit the colours of the model's texture. When struck, the cubes would blow apart and fall to the ground in a Bullet sim, then shrink and disappear. Controls on the velocities and directional biasing allowed art-directability over where and how the cubes would fall.
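The point-scattering step can be sketched simply: fill the target's bounding box with a regular grid of voxel centres, one cube per point. This is a hypothetical illustration of that first step only (in production the points would also be trimmed to the model's interior and inherit texture colour):

```python
# Simplified sketch of the voxelisation step (illustrative only). Points are
# laid out on a regular grid inside the target's bounding box; a cube is then
# instanced on each point. In production these points would additionally be
# kept only inside the model and inherit the nearest surface colour.

def voxel_centres(bbox_min, bbox_max, voxel_size):
    """Return the centre of every voxel filling the bounding box."""
    x0, y0, z0 = bbox_min
    x1, y1, z1 = bbox_max
    centres = []
    x = x0 + voxel_size / 2
    while x < x1:
        y = y0 + voxel_size / 2
        while y < y1:
            z = z0 + voxel_size / 2
            while z < z1:
                centres.append((x, y, z))
                z += voxel_size
            y += voxel_size
        x += voxel_size
    return centres

print(len(voxel_centres((0, 0, 0), (1, 1, 1), 0.5)))  # → 8 voxels in a unit cube
```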
Other effects on the show include shattering glass and smoke effects on the greenhouse drone attack sequence.
A smaller show on my part, with some fairly straightforward destruction of an object struck by a hammer. I did have to develop my own 3D double objects and track them into the shot as this was an ad-hoc request that was not initially accounted for. I did my own shading, lighting and rendering here too. Even though it’s a few small effects, I do feel they came out rather well.
For Locke & Key we did a lot of sequences, mainly focused around fire, or pyro FX. One of the sequences involves a firewall. For this effect I animated the source, which would draw a line to be set on fire in a pyro solver. Controls were built into the toolset to create variety in the source, in the fuel and temperature fields.
The height of the firewall was controllable by adjusting the values in the source, so we could tweak the outline of the wall as required.
Another key sequence (no pun intended) was the mannequin sequence. A key would be jammed into animated mannequins, which would start a fire effect. Using controllers and a spreading effect, the fire source would grow across the surface of the mannequins, setting them alight and eventually enveloping them fully. A secondary particle simulation, tied in with the pyro effects, was used to emit embers.
A more subtle effect, but one that I like for that reason, is the burning travel bag. With the family stood around it, Tyler sets it on fire with the key. For this I did a simple fire effect, but one I feel has turned out quite realistic.
On Locke & Key I also did the shading, lighting and rendering for my effects shots.
This show involved a large variety of effects. From filling a sinking boat with water using FLIP to launching a rocket, parachutes and multiple smaller supporting effects.
The parachutes were used in the Venezuela jump sequences, where a group of military men jump out of the back of an airplane and land in the ocean with their gear. The parachutes in some of these shots were done by me using a Vellum cloth sim. For some shots in the water, the cables attached to them were also simulated with Vellum.
The opening shot to one of the episodes is a rocket launch done by me. Quite far in the distance, so forgiving you might think, but the length of the smoke stack this required made up for that. A fun set of shots, which looked pretty cool in the end.
Smaller supportive effects include exhaust fumes from the airplane engines, propeller rotation, and other such more or less invisible effects.
This was a show full of magic. My main contribution to this show was a magical paper effect where a stack of paper was dropped on the floor, and blew away. The stack then had to magically form itself again.
To achieve the paper effect I simulated the sheets being blown away from a stacked position and then rewound that to get the stacking effect. The reason for this approach was that making the paper fly toward a specific spot and fold nicely into position was extremely difficult to get right, so I opted for blowing the sheets away from the stack and reversing the result.
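At its core, the rewind trick is just a frame remap over the simulated cache. A minimal sketch, with hypothetical frame numbers:

```python
# Minimal sketch of the rewind trick: simulate the sheets blowing away from
# the stacked pose, then remap playback frames so the cache runs in reverse,
# making the pile appear to assemble itself. Frame numbers are hypothetical.

def rewound_frame(playback_frame, sim_start, sim_end):
    """Map a playback frame to the simulated cache frame, reversed."""
    return sim_end - (playback_frame - sim_start)

# At the start of playback we see the last simulated frame (the full stack):
print(rewound_frame(1001, sim_start=1001, sim_end=1100))  # → 1100
# At the end of playback we see the first simulated frame (sheets scattered):
print(rewound_frame(1100, sim_start=1001, sim_end=1100))  # → 1001
```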
I then simulated a final paper sheet on top of it, separate from the original simulation, with a little post-it stuck to it as well. This meant I could break the impression of rewinding something and give it a unique touch by doing a bespoke simulation of a single sheet on top.
Other effects include a swarm of birds taking off from a campus building roof and the initial set-up for the “paper storm” effect in the library, where sheets of paper fly around frantically in a magic spell.
Honda: the Power of Dreams was a fun ad to work on. This was at Electric Theatre Collective, where we were to put up a scaffold with letters for each word in a line of a song. Initially the plan was to model each one by hand, but I suggested we could do this mostly procedurally, and so I developed a tool that would build the word, put a scaffold behind it, and illuminate the letters using an activator.
The letters themselves were pre-modelled. I pulled them into a tool, written in Python, that would take a word as input and place the letters. Using a menu, the artist could then pick a type of scaffold (there were three types), and controls were built in for per-letter kerning, a request from James Sindle, who was placing the words in the scene at the time.
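The placement step with per-letter kerning might look something like the sketch below. Names and spacing values are hypothetical, and the real tool worked on Houdini geometry rather than plain tuples:

```python
# Hedged sketch of the word-building step (hypothetical names and widths).
# Letters are placed left to right at a default spacing; a per-letter kerning
# offset nudges individual letters, as requested by the layout artist.

def place_letters(word, letter_width=1.0, kerning=None):
    """Return (letter, x_position) pairs for each letter in the word."""
    kerning = kerning or {}
    positions = []
    x = 0.0
    for i, letter in enumerate(word):
        x += kerning.get(i, 0.0)      # per-letter kerning tweak
        positions.append((letter, x))
        x += letter_width             # default advance to the next letter
    return positions

# Tighten the second letter slightly:
print(place_letters("HONDA", kerning={1: -0.1}))
```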
This project taught me how to build menu items, buttons, drop-down menus and all manner of editable parameters on a custom-built UI and tool in Houdini.
The end result was an art-directable set-up which made it easy and quick to generate the lettering scaffolds, put them where they were needed and animate the lights switching on with ease.
For the new Seat Leon ad I had a very “non-FX” task for a change, but one that tapped nicely into my skill set of procedural modelling. For this ad I created a fence tool that would build a chain-link fence complete with posts, barbed wire and imperfections like dents and holes.
I created the chain link by modelling only one half of a square and replicating it across and up as required for the width and height of the fence. Posts were spaced along it, controllable with an interval parameter, and the fence was warped last, using a mix of noise and attribute painting to control where it would bend and sag.
For Go Compare I did the cloth simulations for a number of their ads, including the jeans and t-shirt on Monster Bill. To make the cloth simulations work, a tool was first used to remove intersections on the animated model, as otherwise the simulations would explode. The shading, lighting and rendering were all done in Houdini.
Because Monster Bill was covered in hair, these hairs would intersect the clothes. To resolve this I wrote a Python script that would scan the shot, check the simulated cloth caches against the groom and store each intersecting hair it found in an array. This meant we could simply delete the offending hairs from the start of the shot and get clean renders out without having to simulate the hairs, saving us a lot of time.
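The scanning script's core loop might be sketched as below. Everything here is hypothetical (the real script queried Houdini geometry caches); the point is that each hair only needs to be flagged once, after which it can be skipped for the rest of the scan:

```python
# Simplified sketch of the hair-cleanup scan (hypothetical data layout).
# Every frame, each hair is tested against the simulated cloth caches; any
# hair that ever intersects is recorded so it can be deleted from frame one.

def find_offending_hairs(frames, hair_ids, intersects):
    """intersects(frame, hair_id) -> bool. Returns sorted bad hair ids."""
    offending = set()
    for frame in frames:
        for hair_id in hair_ids:
            if hair_id in offending:
                continue                  # already flagged, skip the test
            if intersects(frame, hair_id):
                offending.add(hair_id)
    return sorted(offending)

# Toy usage: hair 1 pokes through the cloth on frame 2 only.
bad = find_offending_hairs(range(3), [0, 1, 2],
                           lambda f, h: h == 1 and f == 2)
print(bad)  # → [1]
```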
On Finlandia I created a set-up that took a grey-tone map of the backplate, using the bright sun and its rays as an activator for the flower field. We had the field matched in 3D, with points scattered across it onto which flowers were instanced. When touched by the light of the sun, they would animate and open up. The shot was eventually finaled by Max Johnson, as I was moved to another project.
One of the key effects on Dyson was a CG carpet. We created a CG carpet which the camera could dive into as the Dyson vacuum cleaner approached. As it got closer, using guide hairs, the carpet strands would bend towards it and flick frantically as if in the path of its suction.
Using a particle simulation, with bits of debris instanced on, I simulated dirt and debris being sucked into the Dyson from the carpet.
For the vortex effect inside the transparent canister emulating the strong suction of the Dyson vacuum technology, I used a custom velocity field derived from drawn curves through which a smoke simulation was run, as well as a large amount of tiny particles. Together these make up the vortex effect as seen in the Dyson ad.
The Einstok project was an unofficial ad by students at the Filmakademie Baden-Württemberg, with whom I cooperated. For this project I produced a series of weather effects. The heavy rain was done using motion-blurred particles with a droplet instanced on top. I built a little tool to control the direction of the rain: moving a red arrow in the viewport points the rain and wind in that direction.
The fog effects were achieved using a volume with a noise run through it, plus some faked advection by using the offset parameters to move the noise along the same vector as the rain. Altogether this made for some wild-looking weather across a number of shots in the Einstok ad.