My first gaming experience was Hunt the Wumpus on a Texas Instruments 99. But what I really sank a lot of time into was the original Nintendo. My favorite was Metroid, then Kid Icarus. I never had any of the later Nintendos, but I did play quite a bit of The Lost Vikings on NESticle. Platform games always fascinated me... the huge map (seeing the full Metroid map with the secret worlds was somehow mesmerizing), the slowly scrolling gameplay, the cumulative experience, slowly working toward one far-off goal through a labyrinthine gauntlet.
Another question came up immediately when I started writing platformista. I decided to start with the core of the game, the engine itself: the code that redraws the screen every few moments to give the viewer the illusion of motion on the canvas. The idea is that you have an ideal frames per second (fps) to achieve, but not every machine will achieve it. For example (using Jaws), you can see that on an iPhone / iPad / iPod / my 10-year-old laptop, this game only achieves about half the target 60 fps. Well, what if you have a game that tries to depict a falling object? A naive implementation tied to fps would cause objects to fall more slowly on slower machines. In other words, each tick of the user interface has to be separate from the ticks of the game code itself, or there must be some other mechanism that lets the game keep track of some sort of "real" time that is not dependent on the fps. I'm keeping things separate by having distinct UI ticks and game ticks. This kind of separation would also be essential in multiplayer games, where things have to be synced among multiple machines.
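To make that idea concrete, here is a minimal sketch of the fixed-timestep pattern: the UI frame runs as often as the machine allows, while game ticks advance physics in constant-size steps. All the names here (GAME_TICK_MS, gameTick, uiFrame) are my own for illustration, not taken from platformista itself.

```javascript
// Game logic advances in fixed 60 Hz steps, no matter the UI frame rate.
const GAME_TICK_MS = 1000 / 60;
const GRAVITY = 9.8; // units per second^2

function makeWorld() {
  return { y: 0, velocity: 0, accumulator: 0 };
}

// One fixed-size game tick: physics always uses the same dt, so the
// result does not depend on how fast the screen is redrawing.
function gameTick(world) {
  const dt = GAME_TICK_MS / 1000;
  world.velocity += GRAVITY * dt;
  world.y += world.velocity * dt;
}

// Called once per UI frame with the real elapsed milliseconds; it runs
// however many whole game ticks the accumulated time covers (0, 1, or
// several), banking the remainder for the next frame.
function uiFrame(world, elapsedMs) {
  world.accumulator += elapsedMs;
  while (world.accumulator >= GAME_TICK_MS) {
    gameTick(world);
    world.accumulator -= GAME_TICK_MS;
  }
}

// Two machines: one drawing at 60 fps, one at 30 fps. After one simulated
// second, both have run the same number of game ticks, so the object has
// fallen the same distance on both.
const fast = makeWorld();
const slow = makeWorld();
for (let i = 0; i < 60; i++) uiFrame(fast, 1000 / 60);
for (let i = 0; i < 30; i++) uiFrame(slow, 1000 / 30);
// fast.y and slow.y now agree: the slower machine just ran two game
// ticks per UI frame instead of one.
```

In a browser you would drive uiFrame from requestAnimationFrame, feeding it the real elapsed time between callbacks; the game ticks stay at 60 Hz regardless.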
You can check on the state of my game engine project here (codename: platformista).