You may notice that something seems different around here. Today I migrated my blog from that crusty old Typo to the shiny new Mephisto! Most things are working well, but excuse the dust.
You might also notice that the Rubygame posts have vanished mysteriously! Well, in fact, they have been moved to…the new Rubygame blog!
I've been sitting on the rubygame.org domain name for a while now, but I was never sure what to do with it. Then I figured, “Hey, let's put all the Rubygame news on that domain, and use jacius.info for my other projects and personal stuff!” Eventually, I'm going to set up a copy of the documentation and downloads on rubygame.org as well.
I don't think I've mentioned here how frustrating RDoc's C parser can be. It's almost up there with SDL_mixer on the headache-o-meter, I swear.
I'll admit, the idea of automatically generating documentation from the code structure and comments in the source code is neat. And RDoc does a pretty good job of figuring out the structure of Ruby code: it can figure out which class is located under which module, which class is the parent of which subclass, which mixin is included in which class, and so on.
But it's not so great at figuring that stuff out in the code for extensions written in C, like Rubygame.
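To illustrate the difference, here's a sketch (the module and class names are invented for illustration, not actual Rubygame code) of the kind of structure RDoc reads straight out of Ruby source, where nesting, inheritance, and mixins are all explicit in the syntax:

```ruby
module Rubygame
  module Sprites          # RDoc sees this module nested under Rubygame...
    module Sprite         # ...and this mixin nested under Sprites.
      def depth; 0; end
    end

    class Group < Array   # The parent class is right there in the source,
      include Sprite      # ...and so is the mixin.
    end
  end
end

# The C-extension equivalent buries the same facts in function calls --
#   mSprites = rb_define_module_under(mRubygame, "Sprites");
#   cGroup   = rb_define_class_under(mSprites, "Group", rb_cArray);
# -- which RDoc's C parser has to reconstruct by guessing from variable names.

p Rubygame::Sprites::Group.superclass   # => Array
```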
I had an interesting idea for a musical sound-generation toy, inspired by Electroplankton. The gist of it is that there is a “sea anemone” on the screen with 5 or so tendrils; at the end of each tendril is a loop that you can grab and pull with the mouse. Each tendril makes a unique sound/pitch, and when you pull a tendril further away from the body, it gets louder. When you let go, it slowly retracts and gets softer. There would be other objects in the scene that you could interact with, such as pins or fishing hooks that would prevent the tendril from retracting.
What would make it even neater is to use Chipmunk to have all of the pieces be physical bodies — a central mass connected by springs to the outer masses.
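For the flavor of it, here's a rough plain-Ruby sketch of one tendril as a damped spring pulling its end mass back toward the body, with volume proportional to stretch. (Chipmunk's damped springs would do the physics for real; the class, constants, and numbers here are all made up.)

```ruby
class Tendril
  REST_LENGTH = 20.0   # tendril's relaxed length, in pixels (arbitrary)
  STIFFNESS   = 8.0    # spring constant
  DAMPING     = 4.0    # velocity damping, so it settles instead of oscillating forever

  attr_reader :length

  def initialize
    @length   = REST_LENGTH
    @velocity = 0.0
  end

  # While grabbed, the mouse just sets the length directly.
  def drag_to(length)
    @length   = length
    @velocity = 0.0
  end

  # After release, integrate the spring each frame (dt in seconds).
  def step(dt)
    stretch    = @length - REST_LENGTH
    force      = -STIFFNESS * stretch - DAMPING * @velocity
    @velocity += force * dt
    @length   += @velocity * dt
  end

  # Louder the further the loop is pulled from the body.
  def volume
    stretch = [@length - REST_LENGTH, 0.0].max
    [stretch / 100.0, 1.0].min
  end
end

t = Tendril.new
t.drag_to(80.0)               # pull the loop out with the mouse
loud = t.volume               # => 0.6
600.times { t.step(1.0/60) }  # let go; over ten seconds it retracts and falls silent
```

A pin or fishing hook would just be a constraint that clamps the length during `step`, so the spring can't pull the tendril back in.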
I wonder how long it would take to create such a toy using Rubygame 3.
Alas, the audio support in Rubygame is woefully inadequate, and messy as well. I don't want to bog myself down adding even more new features for 3.0.0, but I could restructure the Mixer module enough that it could be expanded in the future without breaking backward compatibility.
“A well-used door needs no oil on its hinges.
A swift-flowing stream does not grow stagnant.”
– The Tao of Programming
Eat your own dog food.
– Popular wisdom
Recently, I worked on a simple game using Rubygame: an implementation of nonograms (also known as Picross), a logic game where you use number clues to reveal a simple drawing.
The game itself is in a pretty simple state, but I'm quite pleased with the code behind it. In addition to giving me a chance to use Rubygame, it also gave me a chance to practice writing unit tests / behavior specifications, something that Rubygame itself is rather lacking.
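The core rule those number clues encode is simple to state: a row (or column) is correct when the lengths of its runs of filled cells match the clue numbers, in order. Here's a minimal sketch of that check — the method name and representation are invented for illustration, not the game's actual code:

```ruby
# A row is an array of booleans (true = filled); clues is an array of
# run lengths, e.g. [2, 1]. The clue [0] means the row must be empty.
def row_matches_clues?(row, clues)
  runs = row.chunk { |cell| cell }           # group consecutive equal cells
            .select { |filled, _| filled }   # keep only the filled runs
            .map    { |_, run| run.length }  # [T,T,F,T] -> [2,1]
  clues == [0] ? runs.empty? : runs == clues
end

T = true
F = false
p row_matches_clues?([T, T, F, T, F], [2, 1])  # => true
p row_matches_clues?([T, F, T, T, F], [2, 1])  # => false (the runs are 1, 2)
p row_matches_clues?([F, F, F, F, F], [0])     # => true
```

A check like this is also a natural thing to pin down with behavior specs, one example per clue pattern.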
I have no idea when I'll release the first version of it, but here is a nice screenshot of a puzzle in the process of being solved:
Here's a simple example of when the RGB color model fails to accurately model real life interaction of light and color.
If you drive through any of the tunnels through the Appalachian mountains on the U.S. East coast, you'll likely be greeted by the ugly yellow-orange glow of a low-pressure sodium vapor lamp.
LPS lamps only emit light around the yellow-orange wavelength. As a result, in situations where they are the only light source, such as deep within a mountain tunnel, everything loses its own color, instead becoming a shade of yellow-orange. A car which had a lovely blue hue in the full-spectrum light from the sun will suddenly look nearly pitch black once you enter the limited-spectrum lighting of the tunnel. A white car, which reflects light on multiple wavelengths, will be much more visible, but still entirely yellow-orange. A red car or a green car would probably be just a little bit more visible than the blue car, but you'd be unable to tell that they were red or green if you hadn't seen them in daylight.
Over a year ago, my color studies class visited the Krannert Center for the Performing Arts at the University of Illinois, for a guest lecture/demo on theatre lighting, and the interaction of light with colored objects.
By the end of the demo, I had realized something: RGB just isn't enough to describe the full range of color interaction. And it's not just RGB that's deficient; HSV, HSL, CMYK, etc. all suffer from the same limitation. In fact, any color model which tries to describe a color as a single point will fall short.
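A toy spectral model makes the point concrete. The bands and all the numbers below are invented for illustration: perceived brightness is the surface's reflectance curve multiplied by the light's emission curve, band by band — exactly the information a single RGB point throws away.

```ruby
BANDS = [:blue, :green, :yellow_orange, :red]

# Emission per band (made-up units).
SUNLIGHT = { blue: 1.0, green: 1.0, yellow_orange: 1.0, red: 1.0 }
LPS_LAMP = { blue: 0.0, green: 0.0, yellow_orange: 1.0, red: 0.0 }  # ~589nm only

# Reflectance per band (made-up values).
BLUE_CAR  = { blue: 0.9, green: 0.1, yellow_orange: 0.05, red: 0.05 }
WHITE_CAR = { blue: 0.9, green: 0.9, yellow_orange: 0.9,  red: 0.9  }

def brightness(surface, light)
  BANDS.sum { |band| surface[band] * light[band] }
end

brightness(BLUE_CAR, SUNLIGHT)   # ~1.1  -- clearly visible in daylight
brightness(BLUE_CAR, LPS_LAMP)   # ~0.05 -- nearly black in the tunnel
brightness(WHITE_CAR, LPS_LAMP)  # ~0.9  -- still bright, but all one hue
```

An RGB triple records only the three products under one assumed light source, so it can't predict what happens when the light changes.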
Quick thought: A game where the music gets more out of tune and broken as you tumble towards defeat.
Meteos did this, sort of. It would shift between different music clips depending on the scenario. If your blocks were almost to the top of the screen, the music would become frantic. It was all pre-recorded clips, being mixed in realtime. It wasn't generating music on the fly.
Super Mario Bros. did something almost sort of similar: when time was running out, the song would speed up to double-time (and shift to a higher pitch as a result) to encourage you to go faster.
But I'm thinking of… well, more like Eternal Darkness' insanity meter. I don't know if they did anything with the music in that game (I only played it briefly), but it would be fitting if, as your character started to go insane, notes in the music would start to shift off-pitch a bit, or it would occasionally hit the wrong note, etc.
This would definitely be easiest with MIDI- or MOD-style music, rather than with MP3 or OGG, where all the waveforms are mixed together and hard or impossible to un-mix.
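As a back-of-the-envelope sketch of the idea (no real MIDI library here, just note numbers, and every name and threshold is made up): as an "insanity" level rises from 0.0 to 1.0, each note gets a growing chance of drifting off-pitch or landing on a wrong note outright. With sequenced music the notes are still discrete events, so this mapping is trivial to apply.

```ruby
def derange_note(note, insanity, rng)
  if rng.rand < insanity * 0.3
    note + [-1, 1].sample(random: rng)               # hit an outright wrong note
  elsif rng.rand < insanity
    note + (rng.rand * insanity - insanity / 2.0)    # drift a fraction of a semitone
  else
    note                                             # played cleanly
  end
end

rng    = Random.new(42)          # seeded, so a given run is repeatable
melody = [60, 62, 64, 65, 67]    # C D E F G, as MIDI note numbers

sane   = melody.map { |n| derange_note(n, 0.0, rng) }  # untouched
broken = melody.map { |n| derange_note(n, 0.9, rng) }  # warped, a bit differently each bar
```

With a mixed-down recording you'd have nothing like `derange_note` to hook into — the per-note events are already gone.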