Some Programming Thoughts




An Approach to Problem Solving

I was formally trained in Systems Engineering. And of the many things I was taught, two stuck in my mind the most:

1) If you can solve one problem, you can solve any problem. Simply find the domain mapping function from the solvable domain to the problem domain. (Not always that simple.)

Example:

The cliche is: "If all you have is a hammer, then everything is a nail."

But real life has some screws, too. So what do you do?

You recognize that a screw is a nail in a rotating plane. Apply the rotating domain space transform to the hammer and you suddenly have a screwdriver. Granted, it may not be the prettiest, but with a simple twist of domain space you have a solution to your problem and a new tool for your bag.

and

2) Solve for n=1 and then n=n+1.

Solve your problem for one user/object. Then solve it for n+1 users/objects. I tend to jump ahead and solve for n+1 straight out.

There was a time when applications had hard limits on stored data. Something not unlike the Y2K problem. A Sales Order might have space for only 10 line items, a single payment method and/or one generic tax. (Some production applications in use today still do.)

By programming for n+1 you can avoid many limiting factors. By assuming that there are no assumptions, you impose no arbitrary limits.

One simple technique is to use arrays as a primary data type. Even when you're really pretty sure you don't need one.
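
Something like this, as a minimal sketch in C (the names and the struct layout are mine, invented for illustration): instead of hard-coding space for 10 line items, the order keeps a count and an array that grows on demand. The arbitrary limit never gets written down.

    #include <stdlib.h>

    typedef struct {
        char   sku[32];
        int    quantity;
        double price;
    } LineItem;

    typedef struct {
        LineItem *items;      /* grows as needed - no "only 10 line items" */
        size_t    count;
        size_t    capacity;
    } SalesOrder;

    /* Append a line item, doubling the array whenever it fills up. */
    int order_add_item(SalesOrder *order, LineItem item)
    {
        if (order->count == order->capacity) {
            size_t cap = order->capacity ? order->capacity * 2 : 4;
            LineItem *grown = realloc(order->items, cap * sizeof *grown);
            if (grown == NULL)
                return -1;    /* out of memory; the order is left unchanged */
            order->items    = grown;
            order->capacity = cap;
        }
        order->items[order->count++] = item;
        return 0;
    }

Solving for n=1 here costs nothing extra; solving for n+1 is already done.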

What gets us into trouble is not what we don't know. It's what we know for sure that just ain't so.
- Mark Twain

Art, Craft and Knowledge

I recall Socrates describing the duality of knowledge: technical knowledge, that which can be written down, passed on, taught in schools (the system of doctrines); and the Art, that which comes only from the practice of repeatedly applying that technical knowledge (the practicing, he may have used another term). A key point being that technical knowledge can be taught, but the Art cannot; it must be achieved. The two together make up the total Knowledge. (Pardon to all scholars if my recollection is a hack job.)

I mention this only because there seems to be a contemporary confusion with regard to the craft of software development: a confusion between academic knowledge of programming (the system of doctrines) and the ability to apply this knowledge in a meaningful way (the Art). The culmination of these two being the Grok.

One's success with the System of Doctrines is a good start. It is, however, not the only start. Also, sadly, it is, in and of itself, of no value. Its value comes only from its application in achieving Grok, and then only to the extent that it accelerates the process compared to other starting options.

One's length of service in the Art is indicative of - nothing. There are too many levels of the Art's application, too many degrees of involvement or practice and far too many standards of measurement. All of which are concocted and arbitrary.

And none of these in any manner correlates to one's ability to absorb experience and/or knowledge. Mastery of a Craft comes in degrees and is time invariant. There are prodigies as well as those who will never master anything no matter how long they practice.

So then what? If the measurement of training and/or claimed experience is not authoritative, what is?

I was once asked to perform some computer forensics for a legal case. I began my report by saying, "The presented facts are, in and of themselves, not authoritative. They are however indicative. And when placed within the proper context, a reasonable authority can be established."

The answer is: Context.


Language Preferences

On a routine basis you can read about the next new language and/or why your language of choice is superior to the other languages. Sometimes you can read about the Holy Wars between languages. Without trying to insult any of the zealots, it's all rather laughable. At the end of the development cycle, all those programs in different languages run on the same hardware. And the CPUs don't know or care what language you wrote the application in. Unless, of course, you anthropomorphize CPUs and imagine them sitting around laughing at your code. (And they do, because if they didn't laugh, they'd cry.)

I think of my first language as C, even though academically it was Fortran. I like C. Why? Because it makes Assembler easier. Which is the same reason I like C++. (Because it makes C easier, sometimes.) Beyond that, I really don't care. As long as the machine does what I want.

Then there's the issue of "When in Rome." If you're in the user's browser your language of choice is most likely JavaScript. If you're in an Excel spreadsheet you might be using VBA. Context is everything.

But I do question why anyone is comparing their language to C/C++. I can't be the only one who sees the irony.


Paradigms

Object Orientation, Procedural, Functional, Waterfall, Agile, et al.

Since programmers first rubbed two transistors together, we have always been improving the art. Every so often a new paradigm is introduced which will forever revolutionize the industry. I think George Carlin said it best when he talked about "New and Improved."

The trick, as it were, is not to succumb to every new paradigm. Some work for some people and not for others. There is no silver bullet. The reason for this is that a programmer must grok the paradigm for it to be of any value. Even if one claims to be a programmer, that does not mean they grok the art of software development. And such a foundational requirement must be met before one can grok any additional enhancements or improvements. Copy and Paste, ultimately, will not work. Hence, the sporadic and inconsistent success or failure of paradigms.

If one groks software development, paradigms become entertaining. They either articulate that which the developer has already practiced, though not in a formal way, or they are seen for what they are: The Emperor's New Clothes.

To be fair, on occasion the formalization of existing practices can be shared in a constructive way. It's how we learn from each other. Learning is never bad. What's bad is wasting your time with nonsense. The grass isn't always greener on the other side. Remember that someone selling you a new paradigm is, in fact, SELLING. Selling isn't bad. It just doesn't build software. Much less well written software.

Waterfall and Agile are not competing philosophies. But they can be turned into shackles - don't.

Object Orientation is an abstraction, not a philosophy. And it has nothing to do with physical objects. Please stop teaching it that way.

Abstractions are good only until they become Distractions. We are not coding academic dissertations. We're talking to a machine in its language, because it doesn't even have contempt for ours. Code to the machine, not the abstraction. It's a "forest for the trees" problem.
( And while I'm at it: Stop wasting the machine's time! )

All programs are procedural. It's a matter of scope and the degree of wrapping. The 'goto:'s are implied.

Hardware, like Life, has State. Respect it.

Event based programming is an illusion. It's a convenient illusion.

But don't forget:
There is no spoon. - Spoon Boy, The Matrix
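
A minimal sketch of the illusion (the names and the toy event source are mine): strip away the framework and the "events" are a plain procedural loop, pulling from a queue and calling handlers through a table.

    #include <stddef.h>

    typedef struct {
        int type;                       /* e.g. 0 = key press, 1 = timer tick */
    } Event;

    typedef void (*Handler)(Event *);

    /* Stand-in event source; in real life this blocks on the OS or polls hardware. */
    static int next_event(Event *out)
    {
        static Event pending[] = { {0}, {1}, {0} };
        static size_t i = 0;
        if (i == sizeof pending / sizeof pending[0])
            return 0;                   /* queue drained; the "event-driven" program ends */
        *out = pending[i++];
        return 1;
    }

    /* The whole event-driven architecture, seen from underneath: a while loop. */
    void event_loop(Handler handlers[], int num_handlers)
    {
        Event e;
        while (next_event(&e)) {
            if (e.type >= 0 && e.type < num_handlers && handlers[e.type] != NULL)
                handlers[e.type](&e);   /* "dispatch" is just a call through a table */
        }
    }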

On the other hand, hardware interrupts ARE NOT an illusion. But they can be masked.

Declarative programming will take over, as soon as CPUs stop being so imperative.

Managing programmers is like herding cats. Get a laser pointer.


Edge Cases, Corner Cases

I wish to comment very delicately on this topic, so as not to enrage anyone. I cannot say, with any authority, that there are no edge/corner cases. I just have a healthy disbelief in them. When faced with a generic algorithm that has a number of edge/corner cases, I tend to think that the algorithm's scope is not broad enough. And I move to improve the algorithm.

I once programmed a Point-of-Sale application. It was very typical. All except for Bob. Bob was an edge case. Whenever Bob bought something, all the prices changed. There was special pricing for Bob. But it was made painfully clear that it applied only to Bob, so I should just make Bob's pricing the edge case.

Rather than making Bob an edge case, I made Bob an instance of alternative pricing. In fact, before I was done, everything shifted to alternative pricing and retail customers were just another instance; they were the default instance.

The result was a system that supported promotional specials, time sensitivity, seasonal, closeouts, member discounts, etc., etc., etc. Oh, and of course - Bob. (Who, by the way, was a relative of the owner.)

Pricing was no longer a field in a database; it was its own algorithm, keyed by a sparse array. Edge case gone. Horizon widened.

This is an example of programming for Bob = Bob + 1.
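
A minimal sketch of the shape it took, in C (my names; the production system was considerably larger): pricing becomes a lookup over a sparse table of (price list, item) rules, and retail is simply the price list you fall back to.

    #include <stddef.h>
    #include <string.h>

    /* A rule belongs to a price list: "retail", "member", "seasonal", "bob"... */
    typedef struct {
        const char *price_list;
        const char *sku;
        double      price;
    } PriceRule;

    #define DEFAULT_LIST "retail"       /* retail is just another instance - the default */

    /* Sparse: only the exceptions are listed; everything else falls back to retail. */
    static const PriceRule rules[] = {
        { "retail", "WIDGET", 9.99 },
        { "member", "WIDGET", 8.99 },
        { "bob",    "WIDGET", 0.99 },   /* Bob is no longer an edge case, just a row */
    };

    /* Try the customer's price list first, then the default one. */
    double price_for(const char *price_list, const char *sku)
    {
        const char *lists[] = { price_list, DEFAULT_LIST };
        for (size_t pass = 0; pass < 2; pass++)
            for (size_t i = 0; i < sizeof rules / sizeof rules[0]; i++)
                if (strcmp(rules[i].price_list, lists[pass]) == 0 &&
                    strcmp(rules[i].sku, sku) == 0)
                    return rules[i].price;
        return 0.0;                     /* unknown item; real code would signal an error */
    }

Asking price_for("bob", "WIDGET") gets Bob his price; everyone else falls through to retail. Member discounts, seasonal specials and closeouts are new rows, not new edge cases.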

I take this approach, because coding edge/corner cases is a slippery slope. One that should be approached with trepidation and one that often leads to the next topic.


Spaghetti Code

We all know what a nightmare spaghetti code is. And, in theory at least, we all try to avoid it. Some believe that approaches like OOD, frameworks, etc. can prevent or limit this. I do not. I would humbly suggest that there is not now, nor will there ever be, any external approach to coding that can prevent or limit spaghetti code.

Spaghetti code can creep in for any number of reasons. Sometimes it's a quick fix that becomes too many quick fixes. Sometimes the customer is at fault for constantly changing the requirements. Sometimes it's the consequence of meeting deadlines. Sometimes it's the result of the application's growth. Sometimes it's just sloppy programming.

But what's the solution, if any?

A disciplined mind. A developer who is constantly vigilant for spaghetti code and seizes the opportunity to transform it into a better algorithm. No matter how well you plan your application, the code of an application evolves. It doesn't just happen. It is not the precipitate of an exact engineering formula. If it were, a programming application could write other applications by now. By recognizing that code evolves and looking for spaghetti, you can transform it into structured flow. This is the difference between growth and maturity. And some really neat features may follow.
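
A contrived illustration in C of what that transformation can look like (the scenario is invented): the same logic, once as accreted quick fixes, once after noticing the rule they were circling.

    #include <stddef.h>
    #include <string.h>

    /* Before: three quick fixes later, shipping cost is a pile of special cases. */
    double shipping_before(const char *region, double weight)
    {
        if (strcmp(region, "domestic") == 0) {
            if (weight > 20.0) return 12.0; else return 5.0;
        } else if (strcmp(region, "canada") == 0) {
            if (weight > 20.0) return 22.0; else return 9.0;
        } else if (strcmp(region, "overseas") == 0) {
            if (weight > 20.0) return 40.0; else return 18.0;
        }
        return 5.0;                      /* the default nobody remembers adding */
    }

    /* After: the same rule, stated once as data. */
    typedef struct { const char *region; double base, heavy; } ShippingRate;

    static const ShippingRate rates[] = {
        { "domestic",  5.0, 12.0 },
        { "canada",    9.0, 22.0 },
        { "overseas", 18.0, 40.0 },
    };

    double shipping_after(const char *region, double weight)
    {
        for (size_t i = 0; i < sizeof rates / sizeof rates[0]; i++)
            if (strcmp(rates[i].region, region) == 0)
                return weight > 20.0 ? rates[i].heavy : rates[i].base;
        return rates[0].base;            /* the default, explicit and in one place */
    }

The next region, or the next weight break, is a change to the table, not another branch.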

McCoy: I'll say one thing, Spock. You never cease to amaze me.
Spock: Nor I myself.
- Star Trek V: The Final Frontier


© 2012 and beyond Lawrence L Hovind - All Rights Reserved