The first shots use a nearest-neighbor filter and the second ones use a linear filter.
The only noticeable difference here is the height-coloring gradient.
I’m now using libnoise as the base for generating terrain data.
And I mapped it to a sphere because why not?
I’ve been playing around with alternate ideas for terrain generation, and I just made this.
I now have world editing and multi-threaded chunk building. Progress!
Still using the hard directional light, but here’s a larger world.
Chunks don’t render as fast as I’d like them to, but they work. I think I know why: they all share vertex buffer objects.
I think I made my GPU cry.
Now I have tested and stable ray tracing on the C++ side. Time to port to GLSL.
Still using my hacky shadow-tracing method at the moment, but notice that I added a Lambert calculation, both for lighting and for deciding whether a shadow ray needs to be cast in the first place (that skips roughly half of the required rays).
I’m going to have to write an article on this, but here are some screenshots to show what I’ve been working on.
Pixel-based tracing. Notice the jagged edges because of my very rough tracing. I’m not going to use this method, but I copy-pasta’d my vertex version just to see how it would look.