Monday, July 24, 2017

3ds Max plugin developer wanted

As much as we despise Autodesk and would rather see the entire company go down in a pool of radioactive, fiery plasma (the eyebrow-scorching kind, that is), the fact of the matter is that a sizeable chunk of the 3d artists out there remains loyal to 3ds Max for whatever reason. Due to this shocking fact, we're looking for an outstanding 3ds Max plugin developer with the skills to integrate our technology into 3ds Max (this role is in addition to the two roles advertised in the previous post: the graphics developer and the full-stack developer).

What we're looking for:

  • 2+ years of experience developing plug-ins for 3ds Max
  • Solid understanding of 3d artist workflows
  • Experience with rendering (this is a rendering plug-in)
  • Knowledge of real-time data streaming protocols and technologies (WebSockets, etc.) desirable
  • Keen to keep abreast of cutting-edge technologies in the fields of graphics and rendering

This is a remote contracting role. Send your application to sam.lapere@live.be

Friday, July 21, 2017

Excellent computer graphics developers wanted

As we are nearing the launch of our product, our team is expanding. We're currently looking for a graphics developer and a full-stack developer to join our team in NZ with the following skills and experience:

3D Graphics Developer

  • Bachelor's, Master's or PhD in Computer Science or a similar field
  • Specialisation in computer graphics
  • (Constructive) Solid experience with parametric and non-parametric 3D modelling algorithms
  • Strong mathematical background (especially linear algebra + multivariable calculus)
  • Very good command of C++11 and OpenGL
  • Web development experience desirable
  • Experience with functional programming languages such as Erlang and Haskell is a plus (but not required)
  • Love for learning cutting-edge experimental languages and frameworks
  • Flexible, can-do attitude
  • Perfectionist attitude and an obsession with quality
  • Able to thrive in a very fast-moving team
  • Keen to move, live and work in New Zealand

Full-Stack Web Developer

We're also looking for a top notch full-stack web developer to join our team. Candidates for this role should have:

  • Bachelor's or Master's in Computer Science or an equivalent field
  • Minimum of 3 years of working experience with front-end and back-end web development (e.g. Django, Angular.js and Bootstrap)
  • Hands-on experience with, and an unbounded passion for, real-time high-quality 3D graphics (WebGL, physically based rendering, see e.g. https://github.com/erichlof/THREE.js-PathTracing-Renderer#threejs-pathtracing-renderer)
  • Experience with modern languages such as Go desirable
  • Knowledge of WebAssembly desirable
  • Creative and original problem solving skills
  • Relentless hunger to learn more and become an expert in your field
  • UI design skills are a plus
  • Ability to work independently
  • Highly motivated, perfectionist and driven, with plenty of initiative
  • Keen on working in NZ 

Send your CV to sam.lapere@live.be 

Applications will close once we find the right candidates to fill these roles.

Sunday, July 9, 2017

Towards real-time path tracing: An Efficient Denoising Algorithm for Global Illumination

July is a great month for rendering enthusiasts: there's of course Siggraph, but the most exciting conference is High Performance Graphics, which focuses on (real-time) ray tracing. One of the more interesting-sounding papers, "Towards real-time path tracing: An Efficient Denoising Algorithm for Global Illumination" by Mara, McGuire, Bitterli and Jarosz, was released a couple of days ago. The paper, video and source code are available on the authors' project page.

Abstract 
We propose a hybrid ray-tracing/rasterization strategy for real-time rendering enabled by a fast new denoising method. We factor global illumination into direct light at rasterized primary surfaces and two indirect lighting terms, each estimated with one path-traced sample per pixel. Our factorization enables efficient (biased) reconstruction by denoising light without blurring materials. We demonstrate denoising in under 10 ms per 1280×720 frame, compare results against the leading offline denoising methods, and include a supplement with source code, video, and data.
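
To make "denoising light without blurring materials" concrete, here's a minimal Python/NumPy sketch of the factorization as I read it. The function and parameter names are my own, and the Gaussian blur is only a stand-in for the paper's much faster edge-aware filter:

```python
import numpy as np
from scipy.ndimage import gaussian_filter  # stand-in for the paper's fast filter

def reconstruct(direct, indirect_diffuse, indirect_glossy, albedo, sigma=2.0):
    """Illustrative sketch, not the paper's code. The rasterized direct term
    is kept as-is; the two noisy 1-spp path-traced indirect terms are
    denoised as lighting, with the albedo divided out first so that
    texture detail is not blurred away."""
    eps = 1e-4
    # Demodulate: filter incident light rather than outgoing radiance.
    diffuse_light = indirect_diffuse / (albedo + eps)
    diffuse_light = gaussian_filter(diffuse_light, sigma=(sigma, sigma, 0))
    glossy = gaussian_filter(indirect_glossy, sigma=(sigma, sigma, 0))
    # Remodulate and recombine the three terms.
    return direct + albedo * diffuse_light + glossy
```

All inputs are assumed to be H x W x 3 float arrays; the sigma=(sigma, sigma, 0) argument blurs spatially but not across color channels.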

While the premise of the paper sounds incredibly exciting, the results are disappointing. The denoising filter does a great job removing almost all the noise (apart from some residual noise that remains visible in reflections), but at the same time it kills pretty much all the realism that path tracing is famous for, producing flat and lifeless images. Even the first Crysis from 10 years ago (the first game with SSAO) looks distinctly better. I don't think applying such aggressive filtering algorithms to a path tracer will convince game developers to make the switch to path traced rendering anytime soon. A comparison with ground truth reference images (rendered at 5000 samples per pixel or more) is also missing, for some reason.

At the same conference, a very similar paper will be presented titled "Spatiotemporal Variance-Guided Filtering: Real-Time Reconstruction for Path-Traced Global Illumination". 

Abstract 
We introduce a reconstruction algorithm that generates a temporally stable sequence of images from one path per pixel of global illumination. To handle such noisy input, we use temporal accumulation to increase the effective sample count and spatiotemporal luminance variance estimates to drive a hierarchical, image-space wavelet filter. This hierarchy allows us to distinguish between noise and detail at multiple scales using luminance variance.
Physically-based light transport is a longstanding goal for real-time computer graphics. While modern games use limited forms of ray tracing, physically-based Monte Carlo global illumination does not meet their 30 Hz minimal performance requirement. Looking ahead to fully dynamic, real-time path tracing, we expect this to only be feasible using a small number of paths per pixel. As such, image reconstruction using low sample counts is key to bringing path tracing to real-time. When compared to prior interactive reconstruction filters, our work gives approximately 10x more temporally stable results, matches reference images 5-47% better (according to SSIM), and runs in just 10 ms (+/- 15%) on modern graphics hardware at 1920x1080 resolution.
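
The core of the method, as described in the abstract, is a hierarchical (a-trous) wavelet filter whose edge-stopping weights are driven by the luminance variance estimate. Here's a rough Python/NumPy sketch of a single filter level; the weight formulas and constants are illustrative guesses, not the paper's exact ones:

```python
import numpy as np

def atrous_pass(color, variance, normal, depth, step):
    """One sparse 5x5 cross-bilateral pass of a variance-guided a-trous
    wavelet filter (illustrative sketch). A high local variance estimate
    widens the luminance tolerance, so noisy regions get smoothed while
    converged edges are preserved."""
    h, w, _ = color.shape
    lum = color @ np.array([0.2126, 0.7152, 0.0722])
    sigma_l = 4.0 * np.sqrt(np.maximum(variance, 0.0)) + 1e-6
    out = np.zeros_like(color)
    wsum = np.zeros((h, w))
    for dy in (-2, -1, 0, 1, 2):
        for dx in (-2, -1, 0, 1, 2):
            ys = np.clip(np.arange(h) + dy * step, 0, h - 1)
            xs = np.clip(np.arange(w) + dx * step, 0, w - 1)
            # Edge-stopping weights: luminance (variance-scaled), normal, depth.
            w_l = np.exp(-np.abs(lum - lum[ys][:, xs]) / sigma_l)
            w_n = np.maximum((normal * normal[ys][:, xs]).sum(-1), 0.0) ** 32
            w_z = np.exp(-np.abs(depth - depth[ys][:, xs]) / 0.1)
            wgt = w_l * w_n * w_z
            out += color[ys][:, xs] * wgt[..., None]
            wsum += wgt
    return out / wsum[..., None]
```

In the paper, several such passes are run with the step size doubling each time (1, 2, 4, 8, ...), the variance image is filtered along with the color, and the whole thing is fed by temporally accumulated samples from previous frames.
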
It's going to be interesting to see whether the method in this paper produces more convincing results than the other paper. Either way, HPG has a bunch of other interesting papers which are worth keeping an eye on.

UPDATE (16 July): Christoph Schied from Nvidia and KIT emailed me a link to the paper's preprint and video at http://cg.ivd.kit.edu/svgf.php. Thanks Christoph!

Video screengrab:


I'm not convinced by the quality of filtered path traced rendering at 1 sample per pixel, but perhaps the improvements in spatio-temporal stability of this noise filter can be quite helpful for filtering animated sequences rendered at higher sample rates.

UPDATE (23 July): There is another denoising paper out from Nvidia: "Interactive Reconstruction of Monte Carlo Image Sequences using a Recurrent Denoising Autoencoder", which uses machine learning to reconstruct the image.


Abstract 
We describe a machine learning technique for reconstructing image sequences rendered using Monte Carlo methods. Our primary focus is on reconstruction of global illumination with extremely low sampling budgets at interactive rates. Motivated by recent advances in image restoration with deep convolutional networks, we propose a variant of these networks better suited to the class of noise present in Monte Carlo rendering. We allow for much larger pixel neighborhoods to be taken into account, while also improving execution speed by an order of magnitude. Our primary contribution is the addition of recurrent connections to the network in order to drastically improve temporal stability for sequences of sparsely sampled input images. Our method also has the desirable property of automatically modeling relationships based on auxiliary per-pixel input channels, such as depth and normals. We show significantly higher quality results compared to existing methods that run at comparable speeds, and furthermore argue a clear path for making our method run at real-time rates in the near future.
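
The key architectural idea, the recurrent connections that stabilize the output over time, can be sketched in a few lines of Python (PyTorch). Layer sizes, activations and the class name below are my own guesses for illustration, not the published architecture:

```python
import torch
import torch.nn as nn

class RecurrentConvBlock(nn.Module):
    """Illustrative sketch of a recurrent convolutional encoder stage: the
    block carries a hidden state from frame to frame, which is what gives
    the denoiser its temporal stability on sparsely sampled sequences."""
    def __init__(self, channels):
        super().__init__()
        self.feed = nn.Conv2d(channels, channels, 3, padding=1)
        # Recurrent connection: mixes the current features with the
        # previous frame's hidden state.
        self.recur = nn.Conv2d(2 * channels, channels, 3, padding=1)

    def forward(self, x, hidden):
        x = torch.relu(self.feed(x))
        hidden = torch.relu(self.recur(torch.cat([x, hidden], dim=1)))
        return hidden, hidden
```

The network input would stack the noisy color with the auxiliary per-pixel channels the abstract mentions (depth, normals), and several of these blocks would sit inside an encoder-decoder (autoencoder) with skip connections.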

Monday, July 3, 2017

Beta testers wanted

In the past several months, we have been developing a novel ultrafast photorealistic rendering application and we're almost ready to unleash our beast onto the world. In our humble opinion, we think our innovative, pioneering and revolutionary tech is going to be groundbreaking, earth-shaking, paradigm-shifting, status quo defying, industry-disrupting and transmogrifying, and be greater than the Second Coming of Sliced Bread! In short, we think it's going to be rather good.

We're currently looking for some outstanding beta-testers who have extensive experience with one of the following 3d modeling packages:

- 3ds Max
- Maya
- Cinema 4D
- Modo
- Blender
- LightWave 3D
- SketchUp

and a ray tracing based rendering engine like V-Ray, Corona, Cycles or similar.

The perfect candidate has also won or been nominated for a Montgomery Burns Award for Outstanding Achievement in the Field of Excellence.

To apply, send an email with a link to your artist portfolio to sam.lapere@live.be (people with low frustration tolerance need not apply).

UPDATE: Applications are now closed. Thanks to all who have applied.