
1 January 2026 - Zooming Through Progress



  Happy New Year! Sorry I wasn't able to get a post out for December, but as you can see, this post is a  l o n g   b o i. That's why it took an extra few days to get the post done. I've been busy.

Lighting and Thermal Simulation!

  To recap: lighting is something I have been wrangling on and off for most of the game's development. A long while ago, in the blog post from 31 May 2024, I outlined the S3DTex format, a packed texture that stores every rendered material property in one file. I still use this system, and it has remained essentially unchanged throughout development. The problem is that at world scales, lights are significantly harder to work with.

Why is it so hard to get lights working well?

  Godot uses Forward Rendering. This means that each individual light is another render pass. First, all of the terrain draws without lighting. Then, each light in range is computed, one by one, until every light has been added to the final result.

  This is a pretty simple way to get lighting working, and a lot of games use it for realtime lights. It allows lights to add up into super-bright areas, and even allows negative lights to exist. The problem you run into is what happens when there are a lot of lights. See, if someone fills a chunk with, say, nothing but light blocks, that's 4096 lights to factor in per pixel! Now imagine your poor GPU having to draw that same pixel and do advanced lighting calculations 4096 times in a row, for all 2,073,600 pixels (at 1080p). You'll be re-enacting this video in no time. You might as well start measuring your performance in frames per minute.

So what are the alternatives?

  1. Vertex lighting, which was ruled out pretty quickly. This still requires going through every light, but per vertex rather than per pixel, so instead of over 2 million things to calculate possibly thousands of times, it's only a few hundred or a few thousand.
    1. This still isn't great for performance, and old GPUs would still struggle. It's definitely better, but...
    2. Vertex lighting produces jagged shapes depending on the angle of the light, and...
    3. Vertex lighting doesn't support shadows, so you'd be able to see lights through walls.
  2. VoxelGI. This is a global illumination system that bakes lights into a data structure, a bit like lightmapping, but way simpler and way faster.
    1. This worked for the most part, but still relied on lights (maybe I can bypass that?), so it's the same problem.
    2. This takes time to build, and a small lagspike/delay whenever a chunk changes is not ideal.
    3. Lighting data is large, and takes up a lot of memory.
  3. Stateless Voxel Propagation (my custom technique from the earlier blog post)
    1. Pretty good performance despite its problems and divergence, but...
    2. It's GPU only. Dedicated servers can't use it unless the server device has a GPU.

  This list may seem short, but I spent all that time testing each of these things. I really wanted lighting to work at full fidelity, but I just could not get any of these to work the way I wanted. I was about to give up and just impose a light limit until I found a better solution... And then I remembered: there's a certain mainstream voxel game whose lighting works in this exact scenario. Yep, it's Minecraft's lightmapper.

The solution (and its technicalities)

  The thing with Minecraft's lightmapper is that it's not real lighting, so it's a bit of a downgrade from what could have been. But, it's also got a lot of benefits:

  1. It's easy to compute, since lights don't add together for overbright areas. Slight visual downgrade and unrealistic light behavior, but oh well.
  2. It can be run on the CPU, so servers can do it.
  3. The lighting data can be baked and cached, so the renderer doesn't have to figure out the values every frame.

  With how much other games inspire my designs, you might be wondering why I didn't think of this earlier. Truthfully, I did, but I was quick to reject it. For all its benefits, Minecraft's lightmapper also has a few problems:

  1. Its method of calculation is prone to severe overload. Several mods exist (Sodium, OptiFine, and a few others in that family) specifically to optimize the lighting algorithm.
  2. It's just one shade of brightness, a single numeric value. No colors.
  3. Lights are limited to a range of 15 blocks.

  Frankly, this is kind of lame, hence why I abandoned it. But now that I have tried the other options, I realized it might have been a good lead to pursue after all. The thing is, those limits suck. So I sat down, pondered and theorized through tons of restless nights (seriously, the number of times I had trouble sleeping because I could not stop thinking about fast lighting...), and innovated on top of what already exists to bring something new to the table. I present to you: Parallel Floodfill Lighting. It's (probably) not my invention, but my lighting system has several aces up its sleeve:

  1. The system makes use of SIMD to do its computations with a significant efficiency increase.
  2. On top of this, the system is multithreaded for even more performance.
  3. The system supports RGB lights. Range is currently capped at 15 blocks like Minecraft, simply because of how much brightness I subtract per block, but theoretically lights can have any range they want.
  4. The system supports colored shadow casters. Light shining through colored, translucent materials will come out colored on the other side!
  5. The system supports fluorescent voxels. Voxels can glow and emit their own light in response to being hit by existing light!
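  To make the idea concrete, here's a minimal single-threaded Python sketch of flood-fill light propagation with RGB channels. The real system is SIMD-accelerated, multithreaded C#; `floodfill_light`, its arguments, and the dict-based grid here are all illustrative assumptions, not the game's actual API.

```python
from collections import deque

def floodfill_light(sources, is_solid, falloff=1):
    """BFS light propagation: each step away from a source loses `falloff`
    brightness per channel, giving Minecraft-style 15-block ranges for
    brightness-15 sources. `sources` maps (x, y, z) -> (r, g, b)."""
    light = {}
    queue = deque()
    for pos, rgb in sources.items():
        light[pos] = rgb
        queue.append(pos)
    while queue:
        x, y, z = queue.popleft()
        r, g, b = light[(x, y, z)]
        spread = (r - falloff, g - falloff, b - falloff)
        if max(spread) <= 0:
            continue  # too dim to propagate further
        for nb in ((x+1,y,z), (x-1,y,z), (x,y+1,z), (x,y-1,z),
                   (x,y,z+1), (x,y,z-1)):
            if is_solid(nb):
                continue  # opaque voxels block the flood
            old = light.get(nb, (0, 0, 0))
            new = tuple(max(o, max(s, 0)) for o, s in zip(old, spread))
            if new != old:
                light[nb] = new
                queue.append(nb)
    return light

# A single green light in open air: brightness drops by 1 per block.
lit = floodfill_light({(0, 0, 0): (0, 15, 0)}, is_solid=lambda p: False)
print(lit[(3, 0, 0)])  # (0, 12, 0)
```

Because each channel floods independently, overlapping colored lights blend naturally, which is exactly the rainbow effect in the first screenshot below.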

This image shows a flat area of steel blocks lit up by three light sources in three of the four corners. The left-most source is blue, the center source is green, and the right-most source is red. The lights shine over each other, mixing the colors to form a rainbow.

This image shows a flat area of steel blocks lit up by a white light source. The light source is encased inside of translucent blue blocks, and the resulting light color around the blocks is that same blue color, demonstrating how transparent materials can color light sources.

  I'm quite proud of being able to pull this off. As you can see from the second image, there are a few bugs, though admittedly they're caused by the fact that I don't actually have surface lighting implemented yet. Solid voxels are treated as transparent so that the light value inside them isn't affected, and it's that interior light value that gets applied to the voxel, rather than the light in the direction of the voxel's surface.

  The best part of all this is that my lighting solution also empowers the advanced temperature mechanics I was hoping for. As a reminder, I had a goal of implementing a very crude thermal dynamics system in The Conservatory (and by "crude" I mean it's dumb; it's not meant to be accurate, just to loosely spread temperature). To achieve heaters and coolers, temperature is literally just a fourth light channel. Heat is light. The solution is so comically simple yet so genius at the same time. And as if that weren't enough, this also ticks the box of a pipe-dream stretch goal: the Novan will actually be able to see heat (or I can implement thermal goggles or something).
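  As a tiny illustration of the "heat is light" trick: once temperature rides along as a fourth channel, the exact same per-block attenuation step handles both. This is hypothetical Python, not game code.

```python
# Treating heat as a fourth "light" channel: one RGBA-style tuple per voxel,
# where the fourth component is temperature and attenuates exactly like light.
def attenuate(channels, falloff=1):
    """One propagation step: every channel (r, g, b, heat) dims by `falloff`."""
    return tuple(max(c - falloff, 0) for c in channels)

lamp_and_heater = (15, 10, 5, 12)   # warm orange light, heat level 12
print(attenuate(lamp_and_heater))   # (14, 9, 4, 11)
```

A heater is then just a "light" with a nonzero fourth channel, and a cooler could be the negative-light equivalent.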

  By virtue of how the lighting system works, and how chunk updates are deferred and run in the background, it's totally possible for that age-old joke I cracked to be real. You might just be able to put a portable sun on a planet and heat the whole thing up. Nice. Anyway, the main caveat to this temperature system is that things can't "warm up"; there is no notion of thermal density or energy transfer, since it behaves like light rather than like actual heat. Still, this is very awesome and I am genuinely so stoked to have it available to me. Better still, it would be possible to implement blackbody radiation: voxels above a certain temperature might be able to use the stellar color function to glow red hot and emit that light. Realistically I probably won't implement this (or I might, we'll see) because the function is designed to be perceptually accurate and thus quite expensive to compute, but I think it's really cool that the possibility is there.

Nodeless CharacterBody3D

  A pure C# implementation of Godot's CharacterBody3D now exists for entities. It works a bit differently, as an isolated box with some additional settings (like stair-step collisions). This is a huge upgrade to the old manual entity physics system, but it also meant some changes. Primarily, entities are no longer rigid bodies (they still can be, but the default is now kinematic so that they can use this system). This dramatically improves the feel of controlling your character, all while complying with the nodeless design of entities. It also makes implementing physics on custom entities way easier, since CharacterBody3D was literally designed for this one purpose.

  As mentioned, this has some additional settings, because a 1:1 port of CharacterBody3D would not be ideal. Notably, these features are now available on top of the base functionality of CharacterBody3D:

  • Step height, which allows entities to walk up stairs and slopes instead of getting stuck on them.
  • Smart floor snap height. CharacterBody3D already has floor snapping (which improves collision detection with the ground), but this new technique is aware of whether the character is jumping, and can be used to ride down slopes or elevator platforms without bouncing or shaking.
  • Ceiling platforms (previously only floor/wall were available), which will allow some entities to grip to ceilings.
  • Physical properties (bounce, friction) now function.
  • The transform of the simulated character body is separate from its body, allowing simulated movement without actually moving the physics object.
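  The step-height feature in particular is easy to picture with a toy model. This is a deliberately simplified Python sketch on a 2D heightmap; the real system is a 3D C# CharacterBody3D port, and every name here is made up for illustration.

```python
# Simplified stair-step movement: try the move, and climb only if the rise
# fits within step_height; otherwise treat it as a wall and stay put.
def move_with_step(x, height, target_x, ground_height, step_height=0.5):
    rise = ground_height(target_x) - height
    if rise <= 0:
        # Level or lower ground: snap down to the floor (floor snapping).
        return target_x, ground_height(target_x)
    if rise <= step_height:
        # Small ledge: step up onto it instead of getting stuck.
        return target_x, ground_height(target_x)
    return x, height  # too tall to step up; movement blocked

terrain = lambda x: 0.0 if x < 2 else 0.4  # a 0.4-high step at x = 2
print(move_with_step(1.0, 0.0, 2.5, terrain))                   # (2.5, 0.4)
print(move_with_step(1.0, 0.0, 2.5, terrain, step_height=0.25)) # (1.0, 0.0)
```

The real version does this with shape casts against colliders rather than a heightmap, but the decision logic is the same shape: probe, compare against the step limit, then either climb or block.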

Ores

  One of the things that was on my to-do list but never actually implemented for the longest time was ore veins. There are a lot of ways I could have gone about this, from dedicated ore vein blocks to entities. The system I ultimately decided on is the most versatile: any voxel can have ore attached to it as a piece of metadata. This is actually pretty common in survival games as far as I know, with Minecraft being the outlier. As for how ores render, I opted to do it a bit like SkySaga: Infinite Isles, where ores create physical, 3D "nodules" that jut out of the block. This looks way better than any texture-based solution, because it adds character to the ores by providing shape and other physical properties. It also allows you to see ores through very thin seams in walls, which I think feels better from a gameplay perspective.

  To render the ores, I use a fast instanced rendering technique where each ore provides a snapshot of possible geometric "adornments" that can be added to the surface of voxels. These adornments are randomly selected and placed on voxels in a chunk that have the applicable ore, making each ore block have its own tiny variance to it. This also allows ores to implement custom shaders and visual effects separately from the voxel.
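  The "tiny variance per block" part boils down to deterministic seeding: hash the voxel position into the RNG so the same chunk always grows the same nodules. A hedged Python sketch (the function and variant names are invented, not the game's API):

```python
import random

def pick_adornments(chunk_ores, variants, seed=1337):
    """Deterministically pick an adornment variant per ore voxel by hashing
    the voxel position into the seed, so chunks look identical on every load.
    (Tuples of ints hash stably, so this is reproducible across runs.)"""
    placed = {}
    for pos in chunk_ores:
        rng = random.Random(hash((seed, pos)))
        placed[pos] = rng.choice(variants)
    return placed

nodules = ["nodule_small", "nodule_large", "nodule_cluster"]
a = pick_adornments([(0, 1, 2), (3, 4, 5)], nodules)
b = pick_adornments([(0, 1, 2), (3, 4, 5)], nodules)
print(a == b)  # True: same seed + positions give the same placement
```

The selected variants would then feed an instanced draw call, one instance per adorned voxel, so the variance costs almost nothing at render time.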

  As for actually mining ores, there is no such thing as a mining level/tier like Minecraft has. Instead, ores have the option to modify the health of a voxel, making it take a different amount of time to break. This might make higher-level tools a better choice, but they will never be strictly required. Instead, the real limitation you face with an ore is the technological prowess needed to use it in crafting. In essence, it's completely possible to find an end-game ore really early in your run; you just won't be able to use it until you know how.

Geometry3D Upgrades, Nonphysical Collision Tests, Voxel Morphology

  One of the things I quickly realized I needed was spatial queries that don't use the physics engine. Such queries can be done with physics, but there's a problem: one optimization I make to chunks is deleting collision for chunks you can't touch. This means every chunk is actually hollow on the inside (as far as physics is concerned), so if you were to raycast or do a shape test inside of a chunk, it wouldn't hit anything!

  To fix this, I added the ability to query intersections with Shape3D instances without using physics. This lives mostly in the C# class PhysicsServerExtensions and offers two checks: Point and AABB. The point check tests a single point in space overlapping a collision shape, and the AABB check tests an axis-aligned bounding box. Getting this to work with most shapes was easy, but convex hulls were a nightmare. To handle them, I had to add a new API to Geometry3D that returns an array of Plane instances from a point cloud's convex decomposition; using this array of planes, I can check for collisions with convex hulls. It's fairly complicated, so I won't try to describe it in full, but the takeaway is that it took a lot of work, and frankly I'm glad it's over and done with.
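  The core of a point-vs-convex-hull check via planes is pleasingly small: a point is inside the hull if and only if it sits behind every face plane. Here's a Python sketch of that idea (the plane representation and names are illustrative, not the actual Geometry3D API):

```python
# Point-vs-convex-hull test using face planes. Each plane is (normal, d)
# with the hull occupying the half-space where normal . p <= d.
def point_in_hull(point, planes, eps=1e-9):
    return all(
        nx * point[0] + ny * point[1] + nz * point[2] <= d + eps
        for (nx, ny, nz), d in planes
    )

# A unit cube centred at the origin, described by its six face planes.
unit_cube = [(( 1, 0, 0), 0.5), ((-1, 0, 0), 0.5),
             (( 0, 1, 0), 0.5), (( 0,-1, 0), 0.5),
             (( 0, 0, 1), 0.5), (( 0, 0,-1), 0.5)]
print(point_in_hull((0.2, 0.0, -0.4), unit_cube))  # True
print(point_in_hull((0.6, 0.0, 0.0), unit_cube))   # False
```

AABB-vs-hull is the harder half (it needs separating-axis style reasoning, not just per-plane point checks), which is presumably where most of the nightmare lived.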

  Another change I did with physics was to one of the more niche voxel shapes, SlopedConcave. For perspective, I have four "morphologies" for voxels:

  • FullCube is always a full block regardless of the shape data, unless the shape data is empty.
  • Rigid subdivides a voxel into 2x2x2 cubes, giving the look of something like Minecraft stairs or slabs.
  • SlopedConvex does the same, but uses wedge shapes, allowing the creation of slopes instead of jagged edges. It is convex because if opposite corners are filled, geometry is added to bridge the gap between the two faces. A bit like covering the voxel's geometry in shrink wrap.
  • SlopedConcave is almost identical to SlopedConvex, but no bridging geometry is created. This means if you fill opposite corners, you get two separate tetrahedrons, one in each corner, rather than the diagonal tube-ish thing from the convex version.

  Prior to this change, SlopedConcave used concave colliders. This isn't great, because concave collisions are some of the hardest to compute in any physics engine, and moreover, they are hollow. Clipping inside a concave collider is much easier than with any other shape. On top of this, the physics query mentioned before doesn't support concave colliders; they are just too complex to handle. So, to fix this, I implemented convex decomposition. This adds some API to Mesh that decomposes a concave shape into many convex pieces, all without actually having to create a new mesh node.

  Note that while this does use Godot's HACD library, the geometry for terrain is never approximated and thus retains full accuracy. Ultimately, this results in more physical collider shapes, but those shapes are simpler to compute and aren't hollow. They also support the collision query system mentioned prior. Best of all, this is all baked ahead of time for you during startup, making it work silently in the background. Hooray for free performance!

Research

  Research is one of the game's core mechanics, as integral as something like crafting. I've mentioned it in the past, but a big detail is that I am hugely inspired by the research system from Frackin' Universe. To be frank, my system is almost an exact replica. I've changed some details of how progression works, but the premise is almost identical.

  In essence, to ensure that you learn how to play the game organically, using higher level materials requires you to research those materials first. Vitally, you can still express your freedom and get rewarded for higher risks by finding and collecting the materials long before you can use them. The game won't stop you from doing that. However, to unlock a research node, you must have its predecessors unlocked, and then must have the relevant researchable items in your inventory. Researching them won't consume the items, they are just there as "proof of labor" for lack of a better term. Using stuff too early might tarnish your experience of the game, especially if you miss a bunch of content and quests because of it. That is why this system is so important.
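  The unlock rule described above (predecessors researched, proof items present but not consumed) is simple enough to sketch directly. All names here are hypothetical, not the game's real data model:

```python
# A research node unlocks only when all predecessors are researched and the
# "proof of labor" items are in the inventory; items are checked, not consumed.
def can_research(node, unlocked, inventory, tree):
    prereqs, required_items = tree[node]
    if not all(p in unlocked for p in prereqs):
        return False
    return all(inventory.get(item, 0) >= count
               for item, count in required_items.items())

tree = {
    "basic_metallurgy": ([], {"iron_ore": 3}),
    "steel_working": (["basic_metallurgy"], {"steel_ingot": 1}),
}
inventory = {"iron_ore": 5, "steel_ingot": 1}
print(can_research("basic_metallurgy", set(), inventory, tree))              # True
print(can_research("steel_working", set(), inventory, tree))                 # False
print(can_research("steel_working", {"basic_metallurgy"}, inventory, tree))  # True
```

Note that nothing stops the player from already owning `steel_ingot` long before the node is reachable; the gate is purely on the unlock, which matches the "find it early, use it later" design.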

A preview of the game's research menu. It has a few connected nodes, and then on the right side of the image it has some placeholder text and menu elements describing what the research is about, what it unlocks when you complete it, and what items you need to find in order to complete it.

Godot 4.6 IK Backport

  Godot 4.6 is slated to finally reimplement the IK system that was removed in 4.0 (it was removed for being too inconsistent and janky). If you aren't aware, IK, or inverse kinematics, is the system that lets characters bend their body parts without animation. When an NPC reaches for something, IK is the math that says "okay, bend your shoulder like this, then your elbow like this, ..."

  The Conservatory already has a very primitive IK system I designed almost a year ago, which supports FABRIK and CCD, the two metaphorical "Hello World"s of IK algorithms. Godot, on the other hand, plans to go as far as implementing Jacobian IK, one of the more (most?) complex algorithms. The fact that it's included is incredible, and I am very happy to see it as an option. On top of this, it comes with a handful of IK modifiers that affect how the solver works to prevent issues like twisting and tangling. All together, it's a fully functional IK suite on par with, if not more usable than, what even Unreal Engine has. That's insane for a free engine like Godot. If you thought dealing with transformation matrices was bad, you'll faint at the sight of the math required for Jacobian IK. Thankfully, most of that math is hidden away, and using it is as easy as setting up a few nodes.

  That said, most of my use cases for this are simple world objects and NPC interactions. While I don't think I'll need the full complexity on offer, I'll gladly take it; what kind of fool would I have to be to decline a direct upgrade to what I have now? For this reason, I plan to backport it and the updated SkeletonModifier3D changes into Godot Engine: Conservatory Edition as soon as 4.6 is released and tested.

SHISHUA and New Procedural Generation

  SHISHUA is the name of the current record holder for the fastest pseudorandom number generator in the world (and as of writing, it is unbeaten). Previously, I used the xoshiro256+ algorithm, which is already comically fast, don't get me wrong, but it pales at bulk data generation compared to SHISHUA. As a largely parallel algorithm, SHISHUA gives you way more bang for your buck, which is useful for terrain generation. This randomizer is so fast that my world generator is holding it back. Think about that. The code which generates the data I need runs so bloody fast that it has to wait on me. That is insane! For scale, SHISHUA's parallelism allows it to generate one byte of data every 0.06 clock cycles, and a single iteration of the randomizer produces 128 bytes of data (compare that to the 8 bytes of xoshiro256+).

  Anyway, the main reason this gets a special section is because my port of the library into C# is open source. You can get it here: https://github.com/XansWorkshop/SHISHUADotNet/.

  Now, incredible qualities aside, the actual difference it makes to generation speed varies because, unlike xoshiro256+, it is hardware dependent. If your CPU doesn't have AVX2 support (most modern ones do; I think something like 97% of Steam users have it as of the last hardware survey), the system falls back to the older and less capable SSE2 extension, which cuts its speed in half. Still, even at half speed it's "only" 4x faster than the old algorithm. Not much of a downgrade, and every 64-bit x86 computer has SSE and SSE2 (it's required by spec).1

  The primary reason I chose SHISHUA is the quality of its output, not so much its speed. xoshiro256+ is good and it's fast, but it has problems with seed correlation and output resilience. The side effect is that it is, for lack of a better phrase, "less random" than the new system. Given the amount of data that The Conservatory needs to generate, having a strong randomizer that works at that scale is important.

Accessibility

  The Conservatory has been updated to include the Godot AccessKit hook-in. This is a bit of an odd "plus", since base Godot comes with it, so it would actually be a direct downgrade if I didn't have it. The catch is that, because I am using a fork of the engine, I had to add support for it myself by providing the necessary libraries. It's not a feature out of the box...

  But I digress - accessibility is important to me. I can't promise complete accessibility compliance, but I've done a significant amount of work and implemented most things. I use https://gameaccessibilityguidelines.com as my frame of reference, since this is what is often recommended by Godot and its users. If you are curious, I have created a checklist on the blog where you can track my progress.

New Modding Documentation Site

  Something I have touted for my modders is the completely open nature of the game's internal documentation. To do this, I used a custom generator (so no DocFX, Sandcastle, or whatever else) built on top of the same system that this website runs on: MkDocs with Material. MkDocs turns Markdown pages (a specific text format, .md, which Discord actually uses for its message formatting) into HTML with a bunch of nice features; Material adds the fancy color scheme and the navigation/search stuff.

  The problem is that MkDocs (and by extension, Material) is slow. Like, I can export the game's release build faster than MkDocs can turn text into slightly different text. It's abysmal. So imagine how I feel when I spot a slight bug or a grammar problem in my documentation page and have to wait 5+ minutes to change a single letter. Yeah, I got fed up.

  Now, I use a custom thing that looks and feels exactly like this website here, but which builds the HTML using my own tool entirely. That 5 minute wait has turned into about 15 seconds, and I have much more precise control over exactly how it looks. I'd love to show it off, but I'm a bit hesitant about that since the game is still in its beta and thus a lot is going to change. Throwing documentation at you now would not be wise. Still, here's an example:

A picture of the modding documentation webpage for VoxelMaterial, the game's type representing a type of voxel that can be in the world.

  Let's break this down. The page is about the VoxelMaterial class, showing what it is, where it is, and what it extends/implements. You can see it has documentation including the little bubbles (called "admonitions") to bring attention to things of particular note. Below that, you can see the Constants and Fields sections, which have a note, "Nothing in this category is public. You can choose to view private and internal members anyway, by checking "Show Private/Internal Members" at the top of the page." As it implies, and as you can see, there are three checkboxes at the top of the page which customize what you see.

  Perhaps the most visually interesting things on the members below, however, are the "Allows" and "Prevents" notes under each member. These relate to the implementation of CasCore, which I covered in the blog post on 2 September 2025. As mentioned there, you have access to a trimmed version of C# reflection as well as a limited version of Harmony, and some members have limited access depending on their nature and what they can do. Any attempt to perform a prevented operation will crash the game with a SecurityException, preventing malicious mods from wreaking havoc or accessing your computer in unintended ways.
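  Conceptually, the "Prevents" gate is a denylist checked before any reflective call goes through. A rough Python illustration of the shape of that check (the names and denylist entries are made up; the real CasCore integrates with C# reflection, not a lookup table):

```python
# Illustrative allow/prevent gate in the spirit of CasCore.
class SecurityException(Exception):
    pass

# Hypothetical denylist of (type, member) pairs mods may not touch.
PREVENTED = {("FileAccess", "Open"), ("Script", "set_SourceCode")}

def guarded_invoke(type_name, member, fn, *args):
    """Refuse prevented members before the underlying call ever runs."""
    if (type_name, member) in PREVENTED:
        raise SecurityException(f"{type_name}.{member} is prevented for mods")
    return fn(*args)

print(guarded_invoke("Math", "Abs", abs, -4))  # 4
try:
    guarded_invoke("FileAccess", "Open", open, "/etc/passwd")
except SecurityException as e:
    print("blocked:", e)
```

The key property is that the check happens before the target is invoked, so a prevented operation never executes even partially.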

GDScript Security

  When thinking about mod code security, I realized something: GDScript, Godot's native scripting language, is significantly less dangerous than C# in terms of potency in the hands of a hacker, but it still has some vectors of attack. For this reason, mods made using GDScript are not allowed, and the Script class in C# is locked behind CasCore. I doubt this will be much of a problem, since most of the game's API isn't even exposed to Godot in the first place, but I figured I should mention it anyway.

  If you are curious, some vectors of attack include:

  1. GDScript can access several Godot types which are restricted in C#, like FileAccess and ResourceSaver2
  2. The only way to limit GDScript's capabilities is to implement the restrictions in the engine itself, which would be a spaghetti nightmare.

  And also note:

  1. Attempting to include a script in your mod's assets will cause your mod to fail during the asset query phase, crashing the game.
  2. Attempting to instantiate a script from C#, or modifying Script.SourceCode, will raise a SecurityException and crash the game.

Other Loose Ends

  And finally, the miscellaneous bits! At the start of the post, I said that I have been busy. On top of the changes listed above, there's all manner of tiny tweaks and additions that are just too small to get their own section. Here are those changes, for the curious readers:

  1. PhysicsServer3D now has the IsBody, IsArea, IsShape, IsSpace, IsSoftBody, and IsJoint methods, which can be used to validate Rids passed into various methods. This is used extensively in objects belonging to The Conservatory to raise meaningful exceptions when you pass in the wrong kind of Rid.
  2. World files on disk now use a better packing format, reducing their size even while decompressed. Unused/unloaded worlds can additionally be compressed to save even more space, at the cost of taking longer to load/unload. The compression level (zlib) is an engine config.
  3. Creative mode controls are basically done, and I've got a noclip feature built in too.
  4. 3D billboard GUI is now available. This is different from surface GUI, which already existed, so both are now available for use.
  5. Debug boxes are now available, which are wireframe 3D boxes that can be drawn arbitrarily in 2D GUI code.
  6. I've added a tool which can save chunks to .tscn files for the Godot editor. It's mostly for me, since it helps me debug my shader and whatnot, but it may be useful to you too.
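  For item 2 above, the compress-on-unload idea maps directly onto a standard zlib round trip with a configurable level. A quick Python sketch (the 16 KiB sample data and the level value are stand-ins, not the game's real chunk format or config):

```python
import zlib

# Pack a chunk's bytes, then compress with a configurable zlib level for
# unloaded worlds; level 0 would store the data essentially uncompressed.
chunk = bytes(range(256)) * 64   # 16 KiB of sample voxel-ish data
level = 6                        # stand-in for the engine config value
packed = zlib.compress(chunk, level)
restored = zlib.decompress(packed)
print(len(chunk), len(packed), restored == chunk)
```

Higher levels trade CPU time on load/unload for smaller files, which is exactly the trade-off described in the list item.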

Closing Thoughts

  I'm not going to lie, writing this blog post took several days. Admittedly, a significant portion of the delay was caused by the fact that I wanted to have screenshots to share (especially of the new lighting engine), since those are quite a rarity here on the blog. It takes so much time to proofread posts this long, and to make sure that they are structured in an interesting way, so I'm pretty worn out after this one.

  As for the game's development, things are still chugging along. I still have some lighting bugs to fix as mentioned, but aside from that, the only things left are implementing NPCs, dialogue (already done, kind of), and quests (also already done, kind of). I've also got to implement environmental effects (including weather and temperature mechanics), proper handling of player death, shipbuilding and space travel (and the stellar map), and other nuts and bolts that I'm too tired to think of right now. The important thing is that once those are done, the functional part of the game itself is basically done too.

  This game has taken way longer to make than I would have ever thought, yet it has also gone by so quickly. It's approaching its second year in development, which given the scope of this game is an incredible pace for a one-man band, especially when that one man has never made a game like this before. I can see the light at the end of the tunnel, and I'm pretty sure it isn't a train this time. At least, I hope it's not. Jokes aside, when people like you really sit down and put this kind of time into following my work, it makes me truly grateful to have you around. You are the reason to keep going, to keep posting and being passionate about my work, the reason for me to put so much effort into this silly space game that came around because I wanted to play 3D Starbound. So thank you, truly. Here's to hoping that 2026 isn't a complete flaming shitshow for us folks (especially us queer ones) here in the USA and abroad, yeah? Cheers, and remember to drink your water. o7


  1. If you were to theoretically get this running on ARM, it may be able to use the respective hardware intrinsics for that chip. I have no way to test this myself, hence the note here. 

  2. These classes are not restricted to accessing only user:// and thus may be able to edit any file on the host system. Obviously not very good.