Now, I've never actually played TI-Boy (I should), but I know its grayscale isn't as good as it could be. The reason TI-Boy can't get great grayscale is that it spends all of its CPU time emulating games. But I think I've found a way to make it look really nice.
For my Chess game (which people have said has perfect grayscale), I refresh the screen between 59 and 61 times per second; the user adjusts the exact rate because every calculator's LCD is a little different. But 60 fps takes a pretty big toll, around 50% of the CPU time. Another option is to update the buffer at 30 fps while staying locked to the LCD's refresh rate: write to the screen buffer, let it get displayed twice, then update it again. This method gets some minor scan lines, but it's consistent; there's no flicker.
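To make that concrete, here's a rough sketch of the main-loop version, pacing off crystal timer 1. This is untested, and the port details are from my memory of WikiTI, so double-check them; RenderFrame and CopyGrayBuffer are placeholders for your own routines:

; Sketch: show each rendered frame for two LCD refreshes (~30 fps updates).
; Crystal timer 1 ports on the 84+/SE: $30 = clock select, $31 = loop/interrupt
; control, $32 = counter (writing it starts the countdown, reading it returns
; the current count).

MainLoop:
    call RenderFrame        ; placeholder: draw the next gray frame to the back buffer
    call WaitLcdFrame       ; the LCD shows the old data once...
    call WaitLcdFrame       ; ...and then a second time
    call CopyGrayBuffer     ; placeholder: stream the back buffer out to the LCD
    jr MainLoop

; Busy-wait roughly one LCD frame. 172 ticks at ~10923 Hz is about 15.7 ms,
; one frame of an LCD refreshing in the low 60s; in a real program this count
; is what the user adjusts, since every LCD runs at a slightly different rate.
WaitLcdFrame:
    ld a, $40
    out ($30), a            ; clock source: 32768 Hz crystal / 3 = ~10923 Hz
    xor a
    out ($31), a            ; no auto-reload, no interrupt; count down once
    ld a, 172
    out ($32), a            ; load the counter; the countdown begins
WaitLoop:
    in a, ($32)
    or a
    jr nz, WaitLoop         ; spin until the counter reaches zero
    ret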
So to Calc84:
I don't know how you do your grayscale, but if you are using interrupts, make a two-stage cycle. Using the crystal timers, make the first stage a static length of 172 ticks at speed $40 (10923 Hz). When this interrupt comes back, don't do anything. Then start a second timer of user-adjusted length, defaulting to 172. When that one comes back, update the screen and repeat. (Two stages of 172 ticks at 10923 Hz is about 31.5 ms per full cycle, roughly 31.7 screen updates per second, which lines up with an LCD refreshing in the low 60s; the user tweaks the second stage to match their particular LCD.)
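In code, the idea would look something like this, assuming an IM 2 handler is already installed that dispatches to TimerISR. Again untested: the port numbers and bits are from my memory of WikiTI (including the assumption that crystal timer 1 shows up as bit 5 of port $04 and is acknowledged by rewriting port $31), and stage and userDelay are one-byte RAM locations you'd define yourself:

; Two-stage cycle on crystal timer 1: stage 0 is the fixed 172-tick wait,
; stage 1 is the user-adjusted wait (default 172). The LCD only gets new
; data at the end of stage 1, i.e. once per two LCD refreshes.

SetupTimer:
    di
    ld a, $40
    out ($30), a            ; clock timer 1 from the 32 kHz crystal / 3 = ~10923 Hz
    ld a, %00000010
    out ($31), a            ; fire an interrupt on expiration, no auto-reload
    ld a, 172
    out ($32), a            ; start stage 0 (writing the counter starts the timer)
    xor a
    ld (stage), a           ; stage := 0
    ei
    ret

TimerISR:                   ; reached from the IM 2 dispatcher
    push af
    in a, ($04)
    bit 5, a                ; crystal timer 1 should be bit 5 of port $04
    jr z, Done              ; (a real handler would check other sources here)
    ld a, %00000010
    out ($31), a            ; re-arm; I believe this also acknowledges the interrupt
    ld a, (stage)
    xor 1
    ld (stage), a
    jr nz, StartStage1
    ; stage 1 just ended: push the new frame out, then restart the fixed stage
    push bc
    push de
    push hl
    call CopyGrayBuffer     ; placeholder: your LCD update routine
    pop hl
    pop de
    pop bc
    ld a, 172               ; fixed stage 0 length
    out ($32), a
    jr Done
StartStage1:
    ld a, (userDelay)       ; user-adjusted stage 1 length, default 172
    out ($32), a
Done:
    pop af
    ei
    reti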
Of course, I don't know if this sort of thing is feasible, but if it is, you'll have people talking about how good the grayscale looks. (If you get really desperate, you could even refresh at 20 fps, though I don't know how that would look.)
Here's a demo to show what I'm talking about:
[+] and [-] adjust the delay, [2nd] displays the current delay, and [Clear] quits.