Blender Internal + unbiased rendering

This weekend, while I was waiting for Matt’s photon mapper refinements (in order to avoid code duplication), I also started to code another extension to the Render Internal: Bidirectional Monte Carlo Path Tracing, which will allow the most demanding users to fully solve the rendering equation with a controllable error bound :). That’s pretty similar to what Indigo and Kerkythea do.

fig1

Fig. 1 –  Path traced image generated in blender internal

The world of CG is very complex and broad, and that’s good, because it aims to model both reality and surreality. For that reason there’s no single render setup that works for every case you want to make; that’s why having a full-featured renderer at hand is very important.

Many times I’ve found these debates in forum topics (though some hate them, they are by no means useless; people always need to compare, and things always change): Speed vs Accuracy vs Quality vs Realism vs NPR vs Biased vs Unbiased vs Tools vs Artists and so on. All of them are terms in the life equation and together they form a balance. In Spanish there’s a saying known as the BBB phrase:

“Bueno, Bonito y Barato no existe”
“Good, Beautiful and Cheap don’t exist together”

(like “Cheap, Fast and Good, pick two”)

While in real life there are lots of counterexamples (Linux is BBB, and of course our beloved Blender is BBB and has made a BBB movie 🙂), the deep meaning of that phrase is that with correlated variables, as you gain in one value you lose in its opposite. In order to gain speed you need to sacrifice quality and accuracy. Realism is not always a goal, but it has driven the CG industry and is the engine of its development. NPR is free expression in CG: it allows artists to express themselves the way they want, or to represent only the essentials of a thing, as cartoons do (Freestyle is a big plus for Blender).

Many artists undervalue the so-called intelligent renderers, such as the photorealistic ones (V-Ray, Mental Ray, Final Render, Brazil, Indigo, Kerkythea and so on), saying that those renderers let artists avoid learning the foundations of illumination and composition, since even the dumbest setups look cool in them. That’s not necessarily true: intelligent renderers free the artist from low-level tasks so that they can focus on the real artistic side. Every time people are freed from low-level tasks, higher arts and sciences can be done. Of course, the foundations will always be valid, and smart tools in the wrong hands will do less than dumb tools in expert hands. But there’s a subtle detail often forgotten: tools do matter in artistic expression. One idea will be expressed differently with different tools and materials, and will have a different impact on the viewer accordingly (my father is an oil painter, my sister is a skilled drawer). It’s the artist who creates the masterpiece, and with the same widespread general tools the final result will depend on the artist only; but it’s the tool that, time-wise, limits and shapes the artist’s expressive capability (doing the Mona Lisa in Paint instead of GIMP or Photoshop is indeed possible for a real artist, but how long would it take and how constraining would it be? No layers for precise control, no gradient system: very rigid).

Biased vs unbiased renderers: here I will make a little stop, because there’s a lot of confusion about those terms. Simply put, unbiased means that on average the results of the renderer converge to the correct solution (averaging any number of 1-sample-per-pixel renderings, an unbiased renderer will converge to the correct image). That’s why unbiased renderers are always taken as references for quality control: their results tend to the correct solution if the inputs are correct, or at least the same as those of the other rendering methods being compared. Unbiased renderers try to fully solve the rendering equation, a very intuitive equation but a monstrosity that involves infinities whichever way you look at it, because nature is unlimited 🙂
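
For reference, that equation is Kajiya’s rendering equation. In one common form:

```latex
L_o(x, \omega_o) = L_e(x, \omega_o)
  + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, (\omega_i \cdot n)\, \mathrm{d}\omega_i
```

The outgoing radiance L_o at a point x is the emitted radiance L_e plus the integral, over the hemisphere Ω, of the BRDF f_r times the incoming radiance L_i times a cosine term. Since L_i is itself the outgoing radiance of some other surface point, the equation is recursive; that recursion over infinitely many light paths is where the infinities come from, and it is exactly what Monte Carlo path sampling estimates.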

And the best way to solve that equation to a desired accuracy is through Monte Carlo methods, so typically the error in those renderers shows up as noise. On the other hand, biased renderers make a series of simplifications to the rendering equation to gain speed: deterministic raytracing (Blender already has one) doesn’t account for diffuse interreflection effects (GI), caustics and so on. Deterministic raycasting (volumetrics), in development for Blender, adds another term of the rendering equation to the simplified raytracing equation. Photon mapping (also in development for Blender 🙂) further extends the capabilities of deterministic raytracers to include GI and caustics, but in a biased way: averaging any number of low-resolution photon-map renderings of caustics will not converge to a sharper, correct caustic.
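
To make the “unbiased” claim concrete, here is a tiny Monte Carlo sketch (plain Python, not Blender code): the estimator is correct on average no matter how few samples you use; adding samples only shrinks the noise, roughly as 1/√N.

```python
import random

def mc_estimate(f, n_samples):
    """Unbiased Monte Carlo estimate of the integral of f over [0, 1]."""
    total = 0.0
    for _ in range(n_samples):
        total += f(random.random())  # uniform sample, pdf = 1
    return total / n_samples

# Toy stand-in for the rendering equation's integral:
# integrate f(x) = 3x^2 over [0, 1]; the exact answer is 1.
f = lambda x: 3.0 * x * x
random.seed(0)
coarse = mc_estimate(f, 100)     # noisy, but correct on average
fine = mc_estimate(f, 100_000)   # noise shrinks like 1/sqrt(N)
```

A biased method, by contrast, converges fast but to a slightly wrong answer, and no amount of extra averaging removes that systematic error.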

The advantage of intelligently designed biased renderers is that they push speed without losing too much accuracy or quality. But brute-force unbiased renderers will always be the yardstick against which to measure them, just as raycasting is the yardstick for all the other volumetric rendering methods (shear-warp, splatting, 3D texturing and so on), and they will always bring out, at no extra cost, all the real-life behavior of light.

So this weekend, for learning purposes, in order to stress-test the flexibility of Blender’s render design, and to set up a comparison baseline for the photon mapper, since I have found some artifacts derived from its biased nature:

fig2

Fig. 2 – Deterministic artifacts of caustics in photon mapping

I started implementing a Bidirectional Monte Carlo Path Tracer inside Blender, and I was happily surprised by how “easily” Blender can be extended with it. Of course, Blender currently doesn’t have any physically correct material type (perhaps it soon will), so putting in the wrong input leads to “a correct answer to a wrong problem”.
But still, as a proof of concept and as a future possibility, it will be useful to have a Path Tracer integrated in Blender. Those kinds of renderers have their own user base, which Blender currently lacks, and the Path Tracer also fills in the missing terms of the rendering equation, so there will be few features that Blender’s Render Internal lacks compared to any renderer out there. One of the good things about being a CG programmer is that once you know a little about the underlying workings of CG, you realize that many of the marketing features that sell a product come at very little cost (I don’t mean they were trivial to program; sometimes the simplest of algorithms involves years of research and development!). Here is a small list of the features that come for “free” in Monte Carlo Path Tracing rendering algorithms, simply by performing random sampling and recursion:

• Sampling a pixel over (x, y) prefilters the image and reduces aliasing.
• Sampling the camera aperture (u, v) produces depth of field.
• Sampling in time t (the shutter) produces motion blur.
• Sampling in wavelength λ simulates spectral effects such as dispersion.
• Sampling the reflection function produces blurred reflection.
• Sampling the transmission function produces blurred transmission.
• Sampling the solid angle of the light sources produces penumbras and soft shadows.
• Sampling paths accounts for interreflection.

(All of them are the flashy features of any unbiased renderer 😉)
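
As an illustration of the first bullet, here is a minimal sketch in Python (purely illustrative; the actual implementation lives in Blender’s C renderer) of how jittered pixel sampling prefilters the image. The `trace` callback and the toy edge scene are my own stand-ins, not Blender API; sampling lens coordinates, shutter time, wavelength and so on would slot into the very same loop.

```python
import random

def render_pixel(px, py, spp, trace):
    """Average `spp` jittered samples inside pixel (px, py).

    `trace(x, y)` stands in for casting a ray through image-plane point
    (x, y); jittering (x, y) inside the pixel footprint prefilters the
    image and reduces aliasing, exactly as in the list above.
    """
    color = 0.0
    for _ in range(spp):
        x = px + random.random()  # jitter inside the pixel
        y = py + random.random()
        color += trace(x, y)
    return color / spp

# Toy scene: a hard vertical edge at x = 10.5 (white left, black right).
edge = lambda x, y: 1.0 if x < 10.5 else 0.0
random.seed(1)
val = render_pixel(10, 0, 256, edge)  # pixel straddling the edge -> gray
```

The pixel straddling the edge averages out to roughly mid-gray instead of snapping to pure black or white, which is exactly the antialiasing effect.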
So, imagine the day when you can fully solve the rendering equation in Blender without ever leaving it 🙂! Of course, none of this gets finished in a few days; remember the BBB phrase. Also, I’m currently on volumetrics (the priority) and photon mapping, so I can’t promise anything yet. I’m just giving a preview of my line of development in the Render module, which is headed toward fully solving the rendering equation with Blender’s Render Internal.

This is also an ambitious project, like volumetrics, so help is very welcome. I have a lot to learn in the process, and as always my main slowdowns are the unknowns of Blender’s internals, which I hope will shrink over time. Along the way many subgoals should be met (some of them by me, some by others). I just coded a draft to test viability, and here are some renderings showing only GI; the main use cases of unbiased renderers are archviz, stills, reference images, etc.

Note: these pictures are the direct output of the Path Tracer; no other render algorithms or filters were applied.

fig3

Fig. 3 – 10 samples per pixel (spp) –  render time: 4.35 s

fig4

Fig. 4 – 100 spp –  34 s

fig5

Fig. 5 – 1000 spp – 5 min 34 s

fig6

Fig. 6 – 4000 spp –  22 min

fig7

Fig. 7 – 10000 spp –  56 min

fig8

Fig. 8 – 100 spp –  44 s

fig9

fig10

Fig. 10 – 4000 spp – 28 min
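
A side note on the timings above: render time grows roughly linearly with samples per pixel, while Monte Carlo noise falls only as 1/√spp, so cutting the noise by 10× costs about 100× the render time. A quick sanity check against the numbers read off the captions of Figs. 3–7:

```python
import math

# (spp, seconds) read off Figs. 3-7 above.
timings = [(10, 4.35), (100, 34), (1000, 334), (4000, 1320), (10000, 3360)]

# Cost per sample stays nearly constant, so time grows ~linearly with spp...
per_sample = [t / n for n, t in timings]

# ...but noise only falls as 1/sqrt(spp): going from 100 spp to 10000 spp
# (about 100x the time) cuts the noise by just 10x.
noise_gain = math.sqrt(10000 / 100)
```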

As always, I hope you like it!
Raúl Fernández Hernández (Farsthary)


39 thoughts on “Blender Internal + unbiased rendering”

  1. joel says:

    amazing work farsthary!
    but don’t you think you’re reinventing the wheel a little?
    I mean.. Luxrender is an amazing project and one of its goals is to fully integrate into Blender.
    They have a full team working on their project.
    Even though I love the work, don’t you think it’s better for Blender internal to be a biased renderer? (seeing as it will be used a lot for animations)

    great work!

  2. UglyMike says:

    This is like…..wow.
    I’m starting to have a suspicion that Farsthary doesn’t really exist. In reality he’s probably half a university organized by Ton, hiding behind an avatar. I mean, no single person can simply come out of the blue and code all this stuff in such a short timeframe….

    As for the BBB (Bueno, Bonito y Barato no existe), I’ve heard it said in English as follows: “Cheap, Fast and Good, pick two”

  3. Ruddy says:

    Hi Joel, I think this is part of his learning/research process: the more he discovers through programming, the better, I think 🙂
    He said himself that it was pretty quick to code, so as long as the render API integration hasn’t started…
    Moreover, few Blender users know how to install these high-end wheels, and they will have to wait some time before a perfect and easy coupling; so if in a few months they can have 4 wheels instead of 3 from the start as a bonus, they won’t complain 🙂 (I’m not speaking about users with specific needs; they will always prefer solutions tailored for their needs from the start)

  4. Ruddy says:

    Another point: he has not been coding for many years (he was first a student in Physics), so it would perhaps have taken him far more time than 2 days to get used to the source code of these programs?

  5. hannibar says:

    Hi,

    great work here. I was wondering which algorithm you implemented though. Did you make a path tracer or a bidirectional path tracer ? And if it is a bidirectional pathtracer, did you use Multiple Importance Sampling to combine the different paths ?

  6. Ruddy says:

    LOL uglymike 🙂
    He codes about 4 hrs daily (from midnight to 4 o’clock, tropical schedule) with no distractions on his computer (no internet, no movies)..

  7. toontje says:

    Uglymike, I can’t help but think that you may be right.
    I think there are 2 options here:
    1) 20 guys hiding behind an avatar
    2) The educational system in Cuba is one of the best. I mean really, I personally know a couple of Cuban doctors, and no other ordinary doctor can hold a candle to them. Then you also have guys like Capablanca.

  8. sk2k says:

    Holy moly!!!

    This IS great. I was waiting for a feature like this for the Blender internal renderer. No need anymore to puzzle around with exporting meshes/materials into external renderers.

    MfG
    sk2k

  9. PhilBo says:

    I know that some would say that these things should stay in Kerkythea, Indigo or Luxrender and keep BI biased. I have no problem using the other programs and do often.

    To me, it is such a pain in the neck to set up materials and such in Blender using scripts, only to find out that an update to the script renders the materials useless. I’d love to have the features of the other engines but still use Blender’s materials natively. For me, that is the huge gain.

    Fantastic work. Really, bravo.

  10. Aka says:

    It would be awesome to have physically correct materials in Blender. BRDFs are way better for creating nice mats, especially when it comes to Arch/Viz. This is a must-have.

  11. Dennis F. says:

    Farsthary 🙂 …. this is amazing … I thank you both so much for your work … these are things that many people have awaited a long time …. wow!

  12. raul that’s amazing!! it looks great, do you know of luxrender? maybe you can borrow code from it? it’s open source; it’s currently what I use for my unbiased needs 🙂

    luxrender.net

  13. scorpius says:

    Wow! This is truly wonderful news. I would have been happy with a plain internal path-tracer, but having it be unbiased would make me even happier.

    Keep up the excellent work.

  14. Aka says:

    So, as for now, on the mailing lists there is a discussion about BRDFs and physically accurate mats. It would be nice to have those in Blender too. Scanline materials are cumbersome to set up for Arch/Viz, and unbiased methods are a faster way to go, especially when you have a deadline. Anyway, thanks for your support. xD It’s a big deal to have those inside BI.

  15. In the past year I’ve only just become aware of unbiased renderers. I like Indigo a lot! It feels so much like shooting a picture with my old camera. Thanks for your developments in this direction, but I think I speak for everyone when I say that I’m excited to see volumetrics be completed for 2.5 before you spend a lot of time on this stuff.

  16. Sergeant Oreo says:

    The sheer amount of incredible depth and value your work has and is adding to Blender is so much fun to watch. Thank you Raul, and God bless!

    ~Sergeant Oreo

  17. Raul, I can’t find words to describe what you do, or what you are ! You are to Blender what Asgards are to Stargate SG1 (yes, I’m a fan of both SG1 and Farsthary !) : always bringing new amazing technologies to save the planet !

    Congratulations again and thank you !

  18. Lars says:

    Great stuff!

    *BUT* I see a problem – although I’m sure you’ll fix it real soon:

    In Figures 7 and 8, it is physically impossible for an illuminated object to be brighter than the light source. The colored bars are much darker than the areas just next to them that they illuminate.

    I’m sure it’s just a simple error in the light/intensity/exposure calculation.

    – Lars

    Lars: actually, it is possible, due to reflection. If you account for the ray bouncing off the emitter and the object a few times, at each intersection it picks up more energy. That is why it is far brighter just behind the emitter, where there is a reflecting diffuse object, and not as bright just in front of the emitter.

    In refraction (caustics), all the energy from one light source can be concentrated onto one spot and produce an ultra-bright spot.

  20. Lars says:

    Ok, I guess there are exceptions.

    *BUT* the example renders were done with all diffuse materials, and what I say certainly holds true there.

    An object that glows a fairly dim red does not cause the nearest surrounding objects to look like much brighter hotspots.

  21. Rafael Borowski says:

    I’m one of those who hopes for the Bom, Bonito e Barato!
    As I’m more a CG artist than a programmer I can’t help you,
    but I would like to.
    Your work is amazing!
