I'm going to do a mirror thread here about my little RPG project.

The previous step was this:

Like I mentioned before, I have the following map, which is exported as an "unprocessed" JSON.
This time I added a second layer (rocks and flowers).

The idea next is to build a texture with the indexes of the map for each layer.
I'm not sure yet exactly how I'm going to encode the data into the RGBA texture (currently using 16-bit PNGs); it depends on how well I'll be able to decode it on the shader side. 🤔

Since I'm planning to only store the indexes, the texture will be small.

This 10x10 map can fit in a 16x16 pixel texture, for example. So I can easily have 2 or 4 layers side by side (up to a 64x16 texture).

I could also split the map in more clever ways horizontally to stay under the 4K texture-size limit.
The only downside is that I want an uncompressed texture in memory, to ensure the indexes are preserved.
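As a rough sketch of the side-by-side packing idea (the layer layout and sizes here are my own assumptions, not the project's actual encoding), the tile indices of each layer can be copied into adjacent horizontal slots of one 16-bit array:

```python
import numpy as np

def pack_layers(layers, tex_w=16, tex_h=16):
    """Pack per-layer tile index grids side by side into one 16-bit texture.

    layers: list of 2D numpy arrays (rows x cols) of tile indices.
    Returns a (tex_h, len(layers) * tex_w) uint16 array; unused texels stay 0.
    """
    tex = np.zeros((tex_h, tex_w * len(layers)), dtype=np.uint16)
    for i, layer in enumerate(layers):
        h, w = layer.shape
        tex[:h, i * tex_w : i * tex_w + w] = layer
    return tex

# Example: two 10x10 layers packed into a 16x32 texture.
base = np.arange(100, dtype=np.uint16).reshape(10, 10)
deco = np.full((10, 10), 7, dtype=np.uint16)
tex = pack_layers([base, deco])
assert tex.shape == (16, 32)
assert tex[9, 9] == 99      # base layer
assert tex[0, 16] == 7      # second layer starts at x = 16
```

Saved as an uncompressed 16-bit grayscale PNG, the integer indices survive round-tripping into the shader untouched.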

Took the time to fiddle a bit during lunch and got the basics working: two layers side by side. :D
NumPy arrays seem to be stored sideways (indexed row first), so I had to swap coordinates.
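The swap comes from NumPy indexing arrays as [row, column], which is [y, x] in map terms. A tiny illustration:

```python
import numpy as np

# NumPy arrays are indexed [row, column], i.e. [y, x]: writing a tile at
# map position (x, y) therefore means swapping the coordinates.
w, h = 10, 10
grid = np.zeros((h, w), dtype=np.uint16)  # note: shape is (rows, cols) = (h, w)

def set_tile(grid, x, y, index):
    grid[y, x] = index  # the swap: row = y, column = x

set_tile(grid, 3, 7, 42)
assert grid[7, 3] == 42
```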

Decided to take a little break from the map system and instead added a new postprocess pass (circular blur), because I never have enough of them, all while watching one of my guilty pleasure movies: The Core. :D

Adjusted the vignette mask for the blur and its radius, now I have something a bit more subtle.

Of course, I had to put some Chromatic Aberration in the blur pass. ;D
That blur pass is now the heaviest part of the post-process system, heavier than the bloom itself!

Alright, I always struggle with math, but I'm making progress here: I'm finally able to draw my tile index image in a single screenspace pass, at the right size! :)

Finally made it work!
I'm walking on my map, yay!

I made only the base layer work for now; the other layers are next. Then it will be collisions.

Technically I ditched the idea of using a screenspace shader pass and went back to using dedicated polygons.
The nuance is that I precompute the mesh for each map, instead of using a generic mesh.

My Python toolset parses the Tiled JSON file and builds the mesh into a Lua file, which I simply load at startup. And voilà!
UVs are directly set properly, and I can use vertex colors to carry additional data.
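A minimal sketch of that export step, assuming Tiled's standard JSON fields ("layers", "width", "height", and each layer's flat "data" array of tile ids); the Lua table layout here is hypothetical, just one entry per non-empty tile:

```python
import json

def export_layer_to_lua(tiled_json_path, lua_path, layer_index=0, tile_size=1.0):
    """Parse a Tiled JSON map and emit one entry per tile as a Lua table.

    Tile id 0 means "empty" in Tiled's data array, so those cells are skipped.
    """
    with open(tiled_json_path) as f:
        tmap = json.load(f)
    layer = tmap["layers"][layer_index]
    w = layer["width"]
    with open(lua_path, "w") as f:
        f.write("return {\n")
        for i, tile_id in enumerate(layer["data"]):
            if tile_id == 0:
                continue  # empty cell, no geometry
            x, y = (i % w) * tile_size, (i // w) * tile_size
            f.write(f"  {{ x = {x}, y = {y}, tile = {tile_id} }},\n")
        f.write("}\n")
```

On the Lua side, a plain `local tiles = dofile("map.lua")` (or `require`) would then hand the precomputed data to the game at startup.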

For example, I store the center of each tile in Red and Green so that I can scale the tile up/down to hide the seams caused by texture sampling.
Textures are simple enough that I don't need any padding in them at all.
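One way to bake that per-tile center into the vertex colors at mesh-build time (the quad/vertex representation and normalization below are my own guesses at how this could look, not the project's actual format):

```python
def quad_with_center(x, y, tile_size, map_w, map_h):
    """Build one tile quad; every vertex carries the tile center in R/G.

    The shader can then nudge each vertex toward (R, G) to shrink the quad
    slightly, hiding seams caused by texture sampling at tile edges.
    """
    cx = (x + 0.5) * tile_size
    cy = (y + 0.5) * tile_size
    r = cx / (map_w * tile_size)  # normalized center x, stored in vertex red
    g = cy / (map_h * tile_size)  # normalized center y, stored in vertex green
    corners = [(x, y), (x + 1, y), (x + 1, y + 1), (x, y + 1)]
    return [{"pos": (px * tile_size, py * tile_size), "color": (r, g, 0.0, 1.0)}
            for px, py in corners]
```

All four vertices of a quad share the same R/G, so the shader can reconstruct the center regardless of which corner it is processing.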

The advantage of using Substance Designer for building textures: I can use the height map as a blending mask, which makes terrain/tile transitions free.

Each time I want to build a new transition, I just need to plug two materials together in a filter. :D
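The underlying idea of height-based blending can be sketched like this (a simplified NumPy version under my own assumptions; Substance's actual blend nodes differ in detail):

```python
import numpy as np

def height_blend(albedo_a, albedo_b, height_a, height_b, mask, sharpness=8.0):
    """Blend two materials using their height maps to drive the transition.

    mask: 0..1 blend factor per pixel. Near the transition, the material
    with the greater biased height wins, giving natural-looking edges where
    e.g. rocks poke through grass. `sharpness` controls the edge hardness.
    """
    bias = (height_b + mask) - (height_a + (1.0 - mask))
    t = np.clip(0.5 + bias * sharpness, 0.0, 1.0)[..., None]
    return albedo_a * (1.0 - t) + albedo_b * t
```

With equal heights, mask = 0 returns material A and mask = 1 returns material B; in between, the height maps shape the boundary instead of a flat crossfade.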

Before moving on to collisions (I have already prepped the data for them; now I need to translate it into the game), I want to play a bit with water surfaces.

Right now I've built a new set of tiles with transparency. The goal is to render the map as a black-and-white mask and then generate a distance field from the holes.
From there I should be able to generate dynamic shorelines. :)

Took a few tries, but I got the Jump Flood working and now have a Distance Field texture! :D
Just need to clean up the pipeline now and integrate it in-game! \o/
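For reference, the Jump Flood Algorithm can be sketched on the CPU with NumPy as below (the real thing would run as ping-pong shader passes; note that `np.roll` wraps at the borders, so this version treats the texture as tiling, which is an approximation):

```python
import numpy as np

def jump_flood_distance(mask):
    """Jump Flood Algorithm: distance from every pixel to the nearest
    True pixel in `mask`, in O(n log n) steps of halving jump sizes."""
    h, w = mask.shape
    INF = 1e9
    # seed[y, x] = (y, x) of the nearest known seed, or INF if none yet.
    seed = np.full((h, w, 2), INF)
    ys, xs = np.nonzero(mask)
    seed[ys, xs] = np.stack([ys, xs], axis=1)
    yy, xx = np.mgrid[0:h, 0:w]
    step = 1
    while step < max(h, w):
        step *= 2
    step //= 2
    while step >= 1:
        # Each pixel looks at its 8 neighbors (and itself) `step` away and
        # keeps whichever candidate seed is closest.
        for dy in (-step, 0, step):
            for dx in (-step, 0, step):
                cand = np.roll(seed, (dy, dx), axis=(0, 1))
                d_cur = (seed[..., 0] - yy) ** 2 + (seed[..., 1] - xx) ** 2
                d_new = (cand[..., 0] - yy) ** 2 + (cand[..., 1] - xx) ** 2
                better = d_new < d_cur
                seed[better] = cand[better]
        step //= 2
    return np.sqrt((seed[..., 0] - yy) ** 2 + (seed[..., 1] - xx) ** 2)
```

Feeding it the black-and-white water mask yields the distance field; thresholding or remapping that distance is what makes the dynamic shorelines possible.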


@TomF More like "where should that UV coordinate go" in this case. :D

Gamedev Mastodon