Tuesday, April 24, 2012

30 days and 30 nights of Vim


or, how I learned to stop worrying and love the registers


My trial period for ViEmu from http://www.viemu.com/ ran out yesterday. I had it installed for my day and night coding over the last 30 days of work, hoping that having it there, and trying my best not to turn it off, would force me to learn how to use it. I'd say I've got the most out of my 30-day trial, given that I've been in crunch the whole time: I reckon I've easily amassed over 300 hours working with the plug-in, or using Vim directly in other contexts. I have gone from really not getting it at all, to installing Vim itself and making it my default program for reading, writing, and editing text files of all varieties, from logs to XML to batch files to source.


So, 30 days after installing the plugin, what are my thoughts? Why did I go from "Visual Studio is just fine as it is" to "oh my, it feels so clunky"?


There were downsides, and most of them came from ViEmu being inside Visual Studio. I don't miss Ctrl+W, C to close windows. That was quite infuriating, and in the case of the find-in-files and compilation windows it just didn't work, just like searching in those windows didn't work either. But that's only a shortcoming of mine: not knowing how to push past ViEmu's unbindings and set up a mapping. That much I've learned: the reason you're not productive in vi is never vi's fault; it's that you didn't configure it for your purpose. vi is naturally configured for all purposes at once, and once you get your head around the fact that this "programming" tool is also a diffing tool, a writing tool, and a reading tool, you begin to see why: even with lots of special key combinations, there's only so much you can fit on one keyboard. In short, vi is set up for everyone by default; you need to make it your own.
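For what it's worth, in plain Vim a remapping is a one-liner in your vimrc; I still haven't looked into how ViEmu exposes the same thing, so take this as a plain-Vim sketch, with my own arbitrary choice of key:

    " reclaim the rarely-used Q (ex mode) to close the current window
    nnoremap Q <C-w>c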


Post-ViEmu, I now find myself opening a file and hitting / to find things, then, after realising that's not available, going back to Ctrl+I, and then, when I find the first occurrence, replacing it with an "n" because my fingers expect Vim's next-match. Okay, so I get it: the muscle memory has already soaked in for the basic stuff. I now find Visual Studio slow to manoeuvre around. Finding forwards and backwards, moving by counts of words: all things I'm missing already. But the main thing when it comes to editing existing code is operations on text, and the most basic one I miss is the one mentioned by the developer of ViEmu: duplicating a line ready to make changes.


http://blog.ngedit.com/2005/06/03/the-vi-input-model/
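For anyone who hasn't seen it, that idiom is only a few keystrokes in normal mode:

    yy      " yank the current line into the unnamed register
    p       " put a copy of it below the cursor
    ciw     " then start editing, e.g. change the word under the cursor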


What else am I missing already? How fast the macros were. Visual Studio macros used to be fast; then they seem to have stuck in an entire Visual Basic back end the size of a small house, and now creating a quick macro takes ages, though not quite as long as it seems to take to play it back. I'm not sure how they took a system that worked and broke it so hard, but I've not seen any point in using Visual Studio macros since Visual C++ 6.0. They're just too slow. Anyway, in vi, macros are amazing. They can be created and reused incredibly fast, and they can be edited too, without switching out to a VB editor. In fact, they can be edited in the file you're working on. For those of you not in the know, I'd suggest that macros and registers are actually the main feature to get your head around. Once you have tried and fallen for this mind-blowingly simple system, with so many emergent properties, you'll never want to go back. For me, it was at least two weeks into my learning before someone pointed out that macros and registers are the same thing: a macro is just keystrokes recorded into a register. That was a long time to wait to learn about such an amazing feature.


http://www.infoq.com/presentations/Vim-From-Essentials-to-Mastery
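To make that round trip concrete, here's a sketch, with register a picked arbitrarily:

    qa ... q    " record keystrokes into register a; q stops recording
    @a          " play the macro back (@@ repeats the last one used)
    "ap         " put register a into the buffer as plain text
                " ...edit the keystrokes like any other line...
    0"ay$       " yank it back into register a (y$ leaves out the
                " trailing newline that a linewise yy would add)
    @a          " run the edited macro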


But I'm sure there are more amazing things. I've learned that I need to know about ctags, but still don't. I've learned that I should learn how to read the help, but I haven't. I know I need to start compiling my own .vimrc, and I still haven't done it properly, beyond some basics like the font and my preferred window size.
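For reference, those basics amount to no more than this; the font name and sizes here are placeholders of my own, not recommendations:

    set guifont=Consolas:h10     " gvim-on-Windows font syntax
    set lines=50 columns=120     " preferred window size at startup
    set incsearch hlsearch       " make / searches highlight as you type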


Take the time to do it. Learn vi: not how to open a file and press i, but actually learn vi. You won't regret it unless you give up. It took me, a long-in-the-tooth modeless-editor fan, two weeks to get good enough with it to say it was bearable, and four to say I can't live without it.


It's on every machine I work with now, and it's not coming off.

Sunday, April 08, 2012

Remember time when doing analysis of space


Some elements of development have time and space tied to each other so literally that it's hard to think of a reason not to worry about both at the same time.

Take asset compression.

For physical media, a certain amount of compression is required just to fit your game on a disc. Without any compression at all, many games would either span multiple discs or simply not have much content. With compression, load times go down and processor usage goes up. But how much compression is wanted? There are some extremely high-ratio compression algorithms around that would let some multiple-DVD titles fit on one disc, but would it be worth investing the time and effort in them?
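To put rough numbers on that: a dual-layer DVD holds about 8.5GB, so a title with, say, 20GB of raw assets needs better than roughly 2.4:1 overall just to fit on one disc. Whether the exotic codec that achieves it is worth its decode cost is the real question.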

The only way to find out is to look at the data; in this case, the time it takes to load an asset from disc versus the time it takes to decompress it. If the time to decompress is less than the time to read, it's normally safe to assume you can try a more complex compression algorithm, but only where you have the compute resources spare to do the decompression.
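A back-of-the-envelope example, with made-up figures: at 10MB/s off the disc, a 100MB asset loads in 10 seconds raw. Compressed 2:1, you read 50MB in 5 seconds, and as long as the decompressor can consume 10MB/s of compressed input on a spare core, decompression hides completely behind the read and the whole load still takes 5 seconds. Swap in a heavier codec that manages 4:1 but only decodes 5MB/s of input, and the 25MB read takes 2.5 seconds while the decode takes 5: now the disc is no longer the bottleneck, your spare core is.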

Imagine a game where the level is loaded, then play commences. In this environment, the likelihood that you have a lot of compute resources available during asset loading is very high indeed. Loading screens exist to cover the fact that the CPUs/GPUs are being fully utilised to mitigate disc-throughput limits.

Now imagine a free-roaming game where, once you're in the world, there are no loading screens. Here the likelihood of good compute resources going spare is low, so decompression algorithms have to be lightweight, and assets need to be built so that streaming content is simple and fast too.

Always consider how the data is going to be used, and what state the system will be in when it uses it. Testing your technologies in isolation is a sure-fire way to earn a horrible crunch period when you try to stick it all together at the end.