August 20th, 2009
Internet eccentric and Ruby patron why the lucky stiff has apparently vanished, along with the sum of his work. It’s a damn shame. I’ve never coded a line of Ruby but he was the kind of programmer the world needs more of.
“when you don’t create things, you become defined by your tastes rather than ability. your tastes only narrow & exclude people. so create.” – _why
That quote is gone too, since his twitter account is gone.
EDIT: Here’s some more. The last ones are pretty recent, and give some indication as to what happened.
“if you program and want any longevity to your work, make a game. all else recycles, but people rewrite architectures to keep games alive”
“programming is rather thankless. you see your works become replaced by superior works in a year. unable to run at all in a few more.”
August 3rd, 2009
With the release of Flash 10, Adobe added some support for GPUs via OpenGL and DirectX. The Flex SDK currently offers no way to flag the appropriate bits for this, so recently I endeavored to set this bit myself and see what the wonders of the GPU had to offer.
I wrote a small script to do this, but I found no way to actually verify it’s working, save for graphical anomalies in some circumstances. I couldn’t see any difference in Linux (even with Compiz turned off), so I loaded up ye olde XP and tried it there, with no obvious results.
Frustrated, I tried manually setting the wmode of the swf in an HTML page and finally saw some differences (although only in XP). Unfortunately, the framerate decreased, something I suspected might happen, and worse, it seemed even more inconsistent and choppy than under the regular rendering modes. It might’ve been because I was using it inside a browser, or maybe due to my numerous framerate shenanigans. But more importantly, the bit I flipped seemed to do nothing at all.
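For anyone wanting to try the wmode route themselves, it just means adding a parameter to the embed markup. A hypothetical example (demo.swf stands in for the real file; Flash Player 10 accepts "gpu" for GPU compositing and "direct" for direct blitting):

```html
<!-- hypothetical embed markup; "demo.swf" is a placeholder -->
<object type="application/x-shockwave-flash" data="demo.swf"
        width="640" height="480">
  <param name="movie" value="demo.swf" />
  <!-- "gpu" requests GPU compositing; "direct" requests direct blitting -->
  <param name="wmode" value="gpu" />
</object>
```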
So while at first I was miffed at being left out of the Flash GPU club, now I’m content to wait longer. I’ve found no indication of what, exactly, these features are supposed to offer, nor any way to be sure whether you’re getting any benefit (or hindrance) from them. Unless you’re in Windows and have an eye for graphical anomalies, I suppose.
Here’s the python script I used to set the aforementioned UseGPU bit (from the command line: python setusegpu.py <swf>). You can modify it easily to set the UseDirectBlit bit too (see the code for details). I’d love to hear if anybody can get better results than me, or maybe tell me how wrong and foolish I am and I’ve been misinterpreting everything.
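In case the attachment ever goes missing, the gist of such a script looks something like the sketch below. This is an illustrative reconstruction, not the script itself: it assumes the FileAttributes tag (code 69, whose flags byte holds UseGPU at 0x20 and UseDirectBlit at 0x40 per the SWF spec) is the first tag in the file, which the spec requires for SWF version 8 and up, and it rewrites the file in place.

```python
import struct
import sys
import zlib

# Flags byte of the FileAttributes tag (SWF spec, tag code 69):
USE_GPU = 0x20          # bit 5: request GPU compositing
USE_DIRECT_BLIT = 0x40  # bit 6: OR this in as well for UseDirectBlit

def set_use_gpu(path):
    data = open(path, 'rb').read()
    sig = data[:3]
    body = data[8:]
    if sig == b'CWS':
        body = zlib.decompress(body)  # zlib-compressed after the 8-byte header
    elif sig != b'FWS':
        raise ValueError('not a SWF file')
    body = bytearray(body)
    # Skip the stage RECT: its first 5 bits give the per-field bit width.
    nbits = body[0] >> 3
    pos = (5 + 4 * nbits + 7) // 8  # RECT size in bytes
    pos += 4                        # frame rate (2 bytes) + frame count (2 bytes)
    # Tag header: 2 bytes little-endian; upper 10 bits = code, lower 6 = length.
    code_len, = struct.unpack_from('<H', body, pos)
    if (code_len >> 6) != 69:  # FileAttributes must come first in SWF 8+
        raise ValueError('first tag is not FileAttributes')
    body[pos + 2] |= USE_GPU   # flags byte follows the 2-byte tag header
    body = bytes(body)
    if sig == b'CWS':
        body = zlib.compress(body)
    # The length field stores the *uncompressed* size, which is unchanged,
    # so the original 8-byte header can be reused as-is.
    open(path, 'wb').write(data[:8] + body)

if __name__ == '__main__':
    set_use_gpu(sys.argv[1])
```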