Yesterday I went to pick up my iPhone. Unfortunately I will only have full internet access on January 8th. I noticed today that I already have internet, but I turned everything off. I'll have to look into that. I don't want to run up a lot of costs!
I continued working on the water of PowerRower. Previously it was just a big plane that used shaders to calculate the UV coordinates of the projected reflection texture. It was a very cheap hack, but it was quite ugly. It was not possible to do reflections without shaders, the scene looked quite static, and it was full of artifacts. I also couldn't get the motion of the water to look the way I wanted.
I started working on a completely new system. I am designing the system so that it is able to do the following:
- Must work with fixed function pipeline.
- Must be able to do reflections/refractions.
- Should work at very high framerates.
- No noticeable artifacts.
- Realistic looking wave motion (FFT).
In the system I created an interface to represent heightmaps that can store the data in memory or in a texture. I also created an interface to calculate normal maps from the heightmap interface. Furthermore, I created an interface to represent the geometry of the water. This interface takes care of LOD and frustum culling, although I have not implemented either of these optimizations yet. The water geometry is currently just a grid of quads. For testing purposes it works pretty well. I will post pictures as soon as it starts looking impressive.
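To give an idea of what I mean, here is a rough sketch of such a heightmap interface and the grid-of-quads geometry built from it. All names and signatures here are my own illustration, not the actual PowerRower code:

```cpp
#include <vector>
#include <cstddef>

// Illustrative heightmap interface: implementations could keep the
// data in memory (as below) or in a texture.
class IHeightmap {
public:
    virtual ~IHeightmap() {}
    virtual float getHeight(std::size_t x, std::size_t z) const = 0;
    virtual std::size_t width() const = 0;
    virtual std::size_t depth() const = 0;
};

// A simple in-memory implementation.
class MemoryHeightmap : public IHeightmap {
public:
    MemoryHeightmap(std::size_t w, std::size_t d)
        : w_(w), d_(d), data_(w * d, 0.0f) {}
    float getHeight(std::size_t x, std::size_t z) const { return data_[z * w_ + x]; }
    void setHeight(std::size_t x, std::size_t z, float h) { data_[z * w_ + x] = h; }
    std::size_t width() const { return w_; }
    std::size_t depth() const { return d_; }
private:
    std::size_t w_, d_;
    std::vector<float> data_;
};

// Build the water geometry as a flat grid of quads (two triangles per
// quad), indexed into the heightmap's vertex grid. Returns the number
// of triangles generated.
std::size_t buildGridIndices(const IHeightmap& hm, std::vector<unsigned>& indices) {
    indices.clear();
    for (std::size_t z = 0; z + 1 < hm.depth(); ++z) {
        for (std::size_t x = 0; x + 1 < hm.width(); ++x) {
            unsigned i0 = static_cast<unsigned>(z * hm.width() + x);
            unsigned i1 = i0 + 1;
            unsigned i2 = static_cast<unsigned>((z + 1) * hm.width() + x);
            unsigned i3 = i2 + 1;
            // Two triangles per quad.
            indices.push_back(i0); indices.push_back(i2); indices.push_back(i1);
            indices.push_back(i1); indices.push_back(i2); indices.push_back(i3);
        }
    }
    return indices.size() / 3;
}
```

LOD and frustum culling would then hang off the geometry interface: coarser grids for distant patches, and skipping patches outside the view frustum entirely.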
After implementing spotlights, we are now done with BumRay. At least for this practical. In the second practical we’ll probably have to do textures so that will look even better than what we have so far. I am also planning to implement true global illumination through either path tracing, irradiance caching or photon mapping. I will have to look into that.
I also quickly added motion blur to the C++ raytracer. The results (without proper shading) look quite good. Here is an example:
The image rendered in about 6 seconds with 64 samples per pixel.
Fortunately Yorick forced me to get back to work on PowerRower, so I started on the water. Thank god for people like Yorick, because without them we'd never get anything done.
Just thought I’d post some images of our raytracer for the course Graphics. We went a little further than was required. The entire raytracer is written in Java.
This image took 20 minutes to render with 196 samples per pixel.
Source will be available at the end of the semester. So stay tuned!
It has been a while since I've posted something. Fortunately I have been busy in the months I haven't posted. First of all, I have been working full-time for Electrabel Netherlands. In my spare time I have been working on PowerRower. We decided to ask another developer to join the team and our eyes immediately fell on Marries. We asked him to join the team, and so he did.
I have also been busy looking for a place to live in Utrecht. I've been to some places already but haven't had any luck. Maybe I'm just too picky: I only want to live on the Uithof because it's cheap, they've got 100 Mbit internet there, and it is very close to my university.
I've also been busy with my new course: Graphics. As I enjoy graphics the most of all computer science areas, I started a little early on my practical. We have to write a simple raytracer in Java. I completed the assignment in one afternoon, while we've got a month to finish it. Fortunately there are lots of things I can add. I started out by making the whole thing multithreaded, then added antialiasing techniques and, for fun, ambient occlusion. (Sorry for the bad quality compression.)
My partner, Yorick, added an image pipeline to do some fancy post processing. We are aiming for the highest grade here.
I've gotten a mail from someone who was interested in my post on GameDev.net about HDR (high dynamic range) combined with DOF (depth of field). I love to see people use my ideas in their programs.
Also, mostly because of this course, my interest in offline rendering has reawakened. I decided to do a project in this area. I am thinking about creating a fantastic renderer for public educational use; more on that in my next post.
Finally, I joined a student rowing union: Orca. It's a bit weird to work on a rowing program when you have never rowed in your life! And of course I need the exercise. I had my first training on Monday and I enjoyed it a lot, mostly because my team is very good.
Next time I hope to tell you more about this renderer of mine!
…and you're done. I for one like it a lot (when it's done).
Node-based material system
I have been working on my node-based material system and I am quite pleased with it. I have had a few questions about what a 'node-based material system' actually is, so I am going to try and explain it here.
To render (draw) things in 3D, a lot needs to be done. First of all you need something to render. Things you can render consist of points, and triangles that connect those points to create a solid surface. Most of the time I am using a model of a teapot. Why? I don't know, mostly because I think it fits well as a testing model.
The renderer needs to know how to convert a 3D point to a 2D point on your screen. You can do this by multiplying your 3D vertex (point) by a matrix. This matrix transforms your 3D point to where it should be drawn on the screen. Then it is time to color the model. This can be done using just colors or with textures (images).
To allow nicer-looking materials, shaders were invented. Shaders are small programs that run on the GPU (graphics processing unit, something like the CPU but for graphics). They allow a programmer to make very complicated, and especially good-looking, materials. The shader needs to do all the stuff described above (transform the vertex, color it, etc.). A shader can consist of two types of programs: pixel shaders and vertex shaders. Pixel shaders work on pixels and vertex shaders work on vertices (plural of vertex). In the vertex shader you need to output the screen position and, optionally, some other things to pass along to the pixel shader. The pixel shader needs to output the color of a pixel based on what it received from the vertex shader (or on nothing at all). A very simple shader is shown below:
float4x4 matWorldViewProj : MATWORLDVIEWPROJ;

void vShader(in float4 vertexPosIn : POSITION, out float4 vertexPos : POSITION)
{
    // Calculate the screen position of the vertex
    vertexPos = mul(vertexPosIn, matWorldViewProj);
}

void pShader(out float4 color0 : COLOR0)
{
    // Output the final color
    color0 = float4(1.0, 1.0, 1.0, 1.0);
}
Which results in:
This is not something non-programmers would like to do, especially when it comes to lighting and other complicated stuff. Unreal Engine 3 and Project Offset both use a more advanced way to create shaders. You can use blocks which all 'do something' (Phong lighting block, texture, color, functions, etc.) and link them together to produce the final result.
For my PWS I also created such a system, but it was very buggy and, most importantly, it was static as hell. (Adding new things was very, very hard.)
Anywho, this week I started coding a new node-based material system. It takes two XML files: one describes all the different modules, and the second describes how these modules are linked to each other. I'm not very happy with the layout of the XML link file, but I can work that out.
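To make the two-file idea concrete, here is a hypothetical example of what the module and link files could look like. This is purely an illustration I made up; the actual layout is still in flux (as I said, I'm not happy with it yet):

```xml
<!-- modules.xml: describes the available building blocks -->
<modules>
  <module name="Texture">
    <input name="uv" type="float2"/>
    <output name="color" type="float4"/>
  </module>
  <module name="PhongLighting">
    <input name="diffuse" type="float4"/>
    <input name="normal" type="float3"/>
    <output name="color" type="float4"/>
  </module>
</modules>

<!-- links.xml: describes how the blocks are wired together -->
<material name="SimpleTextured">
  <link from="Texture.color" to="PhongLighting.diffuse"/>
  <link from="PhongLighting.color" to="FinalColor"/>
</material>
```

The shader compiler can then walk the link graph and emit the HLSL for each module in dependency order.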
The shader compiler reads the files and outputs the shader. It is not yet in a state where I can show pretty pictures, but I like it a lot. When the system is in a more complete state I will also make a GUI (graphical user interface) for it, so you can create shaders with just a few mouse clicks. Even my mother will be able to create shaders then!
I will keep you posted with more results.