I’ve written 17 books now, counting the various editions, and with every chapter of every book there are decisions to be made about what to discuss and what not to. In a way, publishing becomes an endorsement, although I’m really not an endorsement kind of person. Which is to say that it’s never my intention to sell people on things, and I don’t believe that my way is the only way, let alone the best way, to do something. This may all sound odd, but it’s the approach I have, for better or for worse. (And, truth be told, there are still many times when I feel strongly that X is the way to do something and Y isn’t.) Really, I feel my job as a writer is to take all the information I come across—from reading other sources, from listening to the experts, and from my own experiences—and synthesize it into a coherent bundle of knowledge, then convey that knowledge in an easy-to-follow manner. Anyway, my point in this foreword is that there are many things that I haven’t written about and that I don’t personally do, but that may be worth other people’s consideration. One such practice is the use of Hungarian notation in programming.
Simply put, Hungarian notation, or any similar kind of application notation, suggests that variable and function identifiers (i.e., names) reflect the kinds of data they store or return, or generally how they’ll be used. So, for example, a variable used as a flag might be a Boolean named bHasOne, or an array of names might be stored in aNames (these are greatly simplified examples of the notation; you can see more in the Wikipedia article). One primary benefit of using such a system is having a consistent naming scheme, which is always critical. Programming notation can also make your code easier to read, as the identifiers themselves become de facto comments. That being said, that same Wikipedia article, and several other places online, quote big-name people suggesting that Hungarian notation is redundant and unnecessarily tedious. The argument goes that while it may be useful for loosely typed languages, like PHP (where you can easily, inadvertently change a variable’s type), strongly typed languages, and object-oriented programming specifically, have no use for notation. Again, I don’t personally adhere to these conventions but thought it’d be worth mentioning for those unfamiliar with the concept.
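To make the idea concrete, here’s a minimal sketch of what Hungarian-style prefixes look like in practice. The specific prefixes (b for Boolean, a for array, s for string) are common illustrative conventions, not a fixed standard, and the variable names are my own invented examples:

```python
# Hungarian-style prefixes used in this sketch (illustrative, not a standard):
#   b -> Boolean flag
#   a -> array/list
#   s -> string

bHasOne = True             # Boolean: is there at least one name?
aNames = ["Ada", "Grace"]  # list (array) of name strings
sGreeting = "Hello"        # plain string

# The prefixes act as de facto comments: a reader can guess each
# variable's type at a glance, without tracing its assignment.
if bHasOne:
    sGreeting = sGreeting + ", " + aNames[0]
```

The trade-off the critics point to is visible even here: in a strongly typed language, or with modern editor tooling, the type information the prefixes encode is already available, so the prefixes just add noise.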
(And, as an aside, if you’re looking for an interesting read, here’s a good, long article on all sorts of coding issues.)