Comments

Would you be able to explain how the conversion of the default sprite to its indexed format is performed? Lorikeet looks like a great tool and I wanted to integrate it into a project I'm working on, but I'd like to create an "all-in-one" shader that first performs the index color conversion on a sprite's pixel and then calls the "GetLorikeetColor" function on that converted pixel value to get the correct palette value. Basically, I'm trying to do everything at runtime instead of using the editor to convert my base sprites to their indexed versions first (there are quite a lot of sprites in my project and it would take a while to do that for all of them).

I tried combining the Lorikeet shader with the standard NTSC-accurate grayscale shader from GameMaker's website, but I quickly realized that isn't what you used to perform the index conversion, so it didn't work. I'd appreciate some insight!

Getting the palette of a sprite at runtime isn't something Lorikeet does (nor do I have any interest in doing so), though if you really want to do that, there's this one that Jon made. The color index is stored in the grayscale value. Look into how old video game consoles stored sprite color before the days of RGBA.

If you want to apply an additional effect after the palette lookup, modify the final gl_FragColor value.
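Something like this, roughly (this is a sketch rather than the actual Lorikeet shader, and the palette/tint uniform names are placeholders I'm making up for the example):

    // fragment shader sketch: palette lookup first, extra effect applied to gl_FragColor afterwards
    varying vec2 v_vTexcoord;
    varying vec4 v_vColour;

    uniform sampler2D u_Palette;   // assumed: the palette stored as a 1-pixel-tall strip
    uniform vec4 u_Tint;           // assumed: whatever additional effect you want to apply

    void main() {
        // the indexed sprite stores the colour index in its grayscale value
        float index = texture2D(gm_BaseTexture, v_vTexcoord).r;
        // look up the real colour at that index
        vec4 looked_up = texture2D(u_Palette, vec2(index, 0.5));
        // modify the final output after the lookup
        gl_FragColor = v_vColour * looked_up * u_Tint;
    }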

I get that the color index is being stored in the grayscale value. I was just curious how the grayscale value itself is being calculated. I'm definitely not asking you to do any more work on the shader, but I figured that if I knew what calculations were being performed to get the indexed grayscale version of the sprite, I could write some additional shader code myself to do that conversion before calling the Lorikeet shader code.

If that's not possible (or overly difficult) I could definitely look into using PixelatedPope's system instead.

The colors are indexed by counting all of the unique colors in the image and putting them in order. That counting can't be done in a shader and has to be done ahead of time, but once it's done, mapping a color onto its index is O(1). Jon's system works by searching the palette for a matching (or almost-matching) color instead, which is much slower but is usually good enough.
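If it helps to see the difference, here's a rough sketch of the two approaches as fragment shader helpers (the uniform names and the 256-entry, 1-pixel-tall palette layout are assumptions for the example, not Lorikeet's or Jon's actual code). With an indexed sprite, the runtime lookup is a single texture read; the search-based approach has to scan the palette for every pixel it draws.

    uniform sampler2D u_SourcePalette;       // assumed: the colours the art was originally drawn with
    uniform sampler2D u_ReplacementPalette;  // assumed: the colours to draw with instead

    // indexed lookup: the grayscale value already is the index, so one texture read is enough
    vec4 colour_from_index(vec2 uv) {
        float index = texture2D(gm_BaseTexture, uv).r;
        return texture2D(u_ReplacementPalette, vec2(index, 0.5));
    }

    // search-based lookup: compare the pixel against every source palette entry, every pixel, every frame
    vec4 colour_from_search(vec2 uv) {
        vec3 source = texture2D(gm_BaseTexture, uv).rgb;
        float best_u = 0.0;
        float best_d = 10.0;
        for (float i = 0.0; i < 256.0; i += 1.0) {
            float u = (i + 0.5) / 256.0;
            float d = distance(source, texture2D(u_SourcePalette, vec2(u, 0.5)).rgb);
            if (d < best_d) { best_d = d; best_u = u; }
        }
        return texture2D(u_ReplacementPalette, vec2(best_u, 0.5));
    }

The first is what the ahead-of-time indexing buys you; the second is roughly the nearest-match idea, and it has to do all of that work per pixel.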

I just realized I never replied to this. Thank you for taking the time to offer clarification, I really appreciate it.

Hi, I'm sorry to bother you again, but I was doing some more work with Lorikeet and had a quick question. In the game I'm developing, my player character's sprites are separated into different strips based on animation (i.e. idle, jumping, running, etc.). Because of this I was planning on exporting each strip individually, importing it into Lorikeet to get its indexed equivalent, and then reimporting the indexed sprite back into GameMaker.

I noticed, however, that despite the fact that all the sprite strips share the same color palette, Lorikeet has been giving each one I import a completely unique indexed equivalent, which I wasn't expecting. You can see it in the following pictures:

[screenshots of the indexed strips]

I was wondering, when making the indexed version of the duck sprite in your demo project, did you index an ENTIRE singular sprite sheet for the duck and then cut it up later in-engine? Or did you have all the different animations as separate sheets and index them individually (like I was trying to do)? I have a sinking feeling you did the former, which would mean the only way around this would be for me to export all the animations in my game, consolidate them into a single sheet, index them, then reimport and cut the sheet in-engine.

I did, yeah - though if it makes it any easier, I also made this little program to make cutting sprite strips into individual images faster: https://drive.google.com/file/d/1xlhksRYzESCeLnZEYLgLJDbIF53uRDay/view?usp=sharing