Gradient-Domain Path Tracing / Gradient-Domain Bidirectional Path Tracing
-------------------------------------------------------------------------
This code extends Mitsuba 0.5.0 and implements the algorithms presented
in the papers "Gradient-Domain Path Tracing" by Kettunen et al. and
"Gradient-Domain Bidirectional Path Tracing" by Manzi et al.
The algorithms internally first render gradient images (differences of
colors between neighboring pixels), in addition to the standard noisy
color images. This is done by variations of standard unidirectional and
bidirectional path tracing as described in the aforementioned papers. They
then solve a screened Poisson problem to find the image that best matches
the sampled gradients and colors. This typically results in much less
noise.
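For reference, here is a rough sketch of the reconstruction step (the
notation below is ours, not from this README; see the papers for the
exact formulation). The L2 reconstruction finds the image I that
minimizes

    % Screened Poisson objective (sketch). I_primal is the sampled
    % color image, G_x and G_y are the sampled gradient images, H_x
    % and H_y are finite-difference operators, and alpha is a small
    % weighting constant (around 0.2 in the G-PT paper).
    \min_I \; \alpha^2 \lVert I - I_{\mathrm{primal}} \rVert_2^2
            + \lVert H_x I - G_x \rVert_2^2
            + \lVert H_y I - G_y \rVert_2^2

The L1 reconstruction replaces the squared L2 norms with L1 norms,
which makes the result slightly biased but more robust to outlier
samples in the gradients.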
The algorithms use the same path sampling machinery as their
corresponding standard methods, but in addition estimate the differences
to neighboring pixels in a way that typically produces little noise.
Using the same path sampling machinery means that if the corresponding
non-gradient method performs very poorly for a given scene, making it
gradient-domain probably won't be enough to save the day. But, as we
demonstrate in the papers, for scenes where the basic method works,
the gradient-domain version often saves a substantial amount of
rendering time.
As described in the papers, the L2 reconstructions are unbiased, but
sometimes show annoying dipole artifacts. We recommend the slightly
biased L1 reconstruction in most cases, as the L1 images are generally
visually much nicer and the bias tends to go away rather quickly.
By default, reconstructing the final images from the sampled data is
done on the CPU for compatibility. We recommend using the provided CUDA
reconstruction when possible. See the instructions below.
The code was implemented and tested using Visual C++ 2013 (with update 4)
and CUDA Toolkit 6.5 and 7.0. Linux and Mac OS X support might require
more work.
The integrator implementations are released under the same license
as the rest of Mitsuba. The screened Poisson reconstruction code from
NVIDIA is under the new BSD license. See the source code for details.
Project home pages:
Gradient-Domain Path Tracing:
https://mediatech.aalto.fi/publications/graphics/GPT/
Gradient-Domain Bidirectional Path Tracing:
http://cgg.unibe.ch/publications/gradient-domain-bidirectional-path-tracing
In case of problems, questions, or comments, don't hesitate to contact
us directly: [email protected] or [email protected].
Features, Gradient-Domain Path Tracing (G-PT):
----------------------------------------------
This implementation supports diffuse, specular and glossy materials. It
also supports area and point lights, depth-of-field, pixel filters and
low discrepancy samplers. There is experimental support for sub-surface
scattering and motion blur.
Note that this is still an experimental implementation of
Gradient-Domain Path Tracing that has not been tested with all of
Mitsuba's features. Notably there is no support yet for participating
media or directional lights. Environment maps are supported, though.
When running in the GUI, the implementation first displays the sampled
color data. When rendering reaches 100%, it reconstructs the final
image with the given reconstruction method. The gradient and color
buffers are written to disk, so for example NVIDIA's screened Poisson
reconstruction tool may be used to experiment with reconstruction
parameters at will. When timing the method, note that if multiple
render jobs are queued, Mitsuba starts the next job without waiting
for the reconstruction to finish, slowing the reconstruction down.
This implementation does not yet support the 'hide emitters' option in
Mitsuba, even though it is displayed in the GUI!
Features, Gradient-Domain Bidirectional Path Tracing (G-BDPT):
--------------------------------------------------------------
This implementation supports diffuse, specular and glossy materials. It
also supports area and point lights, depth-of-field, motion blur and low
discrepancy samplers. Note that currently only the box pixel filter is
supported. When rendering has finished, the implementation will solve and
show the L1 reconstruction in the GUI. However, the L2 reconstruction,
the primal image and the gradient images are also written to disk.
Note that this is still an experimental implementation that hasn't been
tested with all of Mitsuba's features. Notably, there is no support yet
for any kind of participating media. Also, smart sampling of the direct
illumination is not implemented (i.e., no sampleDirect option as in
BDPT).
Installing:
-----------
- Download the dependencies package from the Mitsuba Repository
https://www.mitsuba-renderer.org/repos/ and extract it into the Mitsuba
directory as 'dependencies'.
- If you want to use the faster GPU reconstruction with CUDA, extract
gradientdomain_dependencies_CUDA.zip and follow the instructions in
the section 'Faster reconstruction on the GPU' below.
- Building requires SCons, which in turn requires 32-bit Python 2.7;
newer Python versions do not work with SCons. Add the python27 and
python27/Scripts directories to the VC++ directories / executable
directories.
- Pywin32 is also required.
- Compile Mitsuba with Visual C++ 2013 or above.
Troubleshooting:
----------------
- Make sure that the config.py files use the DOUBLE_PRECISION flag
instead of SINGLE_PRECISION, since gradient rendering is very sensitive
to the precision used. This will hopefully be fixed at a later time.
- To generate documentation with Doxygen, run the gendoc.py script in
mitsuba/doc. If this fails, it might be because some LaTeX packages
are missing (especially mathtools). With MiKTeX, install them with
the package manager admin tool (under Windows the tool can be found
in MiKTeX\miktex\bin\mpm_mfc_admin).
- In case of problems, remove database files like .sconsign.dblite,
.sconf_temp and everything in build/Release and build/Debug.
- Adapt the config.py file used in the NMake build.
- In case of the error 'Could not compile a simple C++ fragment, verify
that cl is installed! This could also mean that the Boost libraries are
missing. The file "config.log" should contain more information', try
bypassing the test in build/SConscript.configure by changing line 137
from
    conf = Configure(env, custom_tests = { 'CheckCXX' : CheckCXX })
to
    conf = Configure(env)
- We noticed that if we let OpenMP use all CPU cores for the Poisson
reconstruction, it may end up being slower than a single-core
reconstruction. We suspect this is caused by hyper-threading. Our
workaround is to use only as many threads as there are physical cores.
If you experience random slowdowns in the reconstruction (on CPU),
please decrease the number of cores used in
src/integrators/poisson_solver/BackendOpenMP.cpp by one or two.
Usage:
------
- The rendering algorithms are implemented as integrator plugins
"Gradient-Domain Path Tracer" (G-PT) and "Gradient-Domain Bidirectional
Path Tracer" (G-BDPT). They can be chosen directly in the Mitsuba
GUI, or by setting the integrator in the scene description files to
"gpt" or "gbdpt".
- If running from the command line, be sure to set the scenes to use
"multifilm" instead of "hdrfilm", since the methods render to multiple
buffers simultaneously. See the sketch after this list.
- Note that while rendering is in progress, what is displayed is only
the sampled color data. Reconstruction of the final image is done
only after the color and gradient buffers have been sampled.
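As an illustrative sketch, a minimal scene description for command-line
rendering might look as follows. The integrator and film type names
("gpt", "gbdpt", "multifilm") come from this release; the remaining
element names and parameters follow standard Mitsuba 0.5 scene-file
conventions and may need adapting (in particular, the multifilm
parameters shown here are assumptions, not verified against this
plugin):

    <scene version="0.5.0">
        <!-- Select the gradient-domain path tracer; use
             type="gbdpt" for the bidirectional variant. -->
        <integrator type="gpt"/>

        <sensor type="perspective">
            <!-- Render to multiple buffers (primal image and
                 gradients) instead of the usual hdrfilm. -->
            <film type="multifilm">
                <integer name="width" value="512"/>
                <integer name="height" value="512"/>
            </film>
        </sensor>
    </scene>

The scene is then rendered as usual from the command line, e.g.
'mitsuba scene.xml', and the sampled buffers are written to disk as
described above.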
Faster reconstruction on the GPU:
---------------------------------
We recommend using the provided CUDA reconstruction code for faster
reconstruction using the GPU. To compile with CUDA reconstruction,
unzip the file 'gradientdomain_dependencies_CUDA.zip' and overwrite
any files when prompted.
You additionally need to have the CUDA Toolkit installed (at least
version 6.5) with a suitable NVIDIA GPU, and you need to *manually*
copy the static CUDA runtime library cudart_static.lib (provided in
the CUDA Toolkit installation folder CUDA/vx.x/lib/x64) into the
folder dependencies/lib/x64_vc12. Finally, rebuild the solution.
To switch back to CPU reconstruction, unzip the file
'gradientdomain_dependencies_DEFAULT.zip', overwrite any files when
prompted and rebuild the solution.
The CUDA reconstruction library is compiled for 64-bit Windows using
VC++ 2013. To rebuild the CUDA reconstruction library, get the source
release from the project home page of Gradient-Domain Path Tracing.
Change log:
-----------
2019/03/01: Fix a bug in the environment map shift.
2017/06/28: Improve G-PT's config serialization. Fixes network
rendering.
2015/12/18: Fix vertex classification for perfectly specular material
components.
2015/10/06: Improve G-PT's vertex classification for multi-component
materials.
2015/10/05: Fix handling of specular materials in G-PT. Fixes glitches
in scenes Bottle and MatPreview.
License Information:
--------------------
All source code files contain their licensing information.
Most notably, unlike the rest of the Mitsuba code, the screened Poisson
reconstruction code is NOT covered by GNU GPL 3, but by the following
license:
Copyright (c) 2015, NVIDIA CORPORATION. All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
* Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
* Neither the name of the NVIDIA CORPORATION nor the
names of its contributors may be used to endorse or promote products
derived from this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL NVIDIA CORPORATION BE LIABLE FOR ANY
DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.