In his recent blog post “Watching the Dawn”, Fred Kern comments on the life of an engineer before the realization that symbolic approaches to computing can get you better results faster. The analogy, of course, is that prior to this revelation we were in some sense in the dark. I’d like to add my two cents’ worth, as I was indeed one of those engineers lurking in the dark for many years.

Flash back about 20 or so years. I was a poor graduate student, and to feed myself, I began doing small jobs for a new company called Waterloo Maple Software (which eventually became Maplesoft). Mostly, my work was to develop small applications or demonstrations with an engineering focus. I remember with great fondness the look of shock and awe that would come over my engineering colleagues’ faces when I showed them how I computed symbolic matrix products or performed a cumbersome simplification in seconds. For me, it was an obvious thing to do because I had access to the technology and I didn’t know any better. But for them, it seemed like pure voodoo. In reality, the common themes that I somehow stumbled upon during these early presentations would later reappear in much richer, more exciting forms as core themes in the eventual “symbolic sunrise” twenty years later.

The core themes, in my mind, were:

**Automate manual processes**. An engineer’s jaw drops when he or she sees Maple or MapleSim for the first time because it literally replaces weeks, if not months, of brutal manual mathematical labor in a matter of hours. In my standard product presentation I always begin with a common transfer function manipulation, which, if done manually (as I once had to, and as countless people still in the dark do today), takes about 20 minutes if you don’t make a mistake. With Maple, total time from start to finish is about 10 seconds. Beyond this simple example, the engineering world is full of similar kinds of brute-force manipulation: any of the transforms, partial fraction expansion, series expansions, anything involving matrices, and countless others.
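To give a flavor of the kind of manipulation being described, here is a partial fraction expansion of a simple second-order transfer function. This is a sketch using SymPy rather than Maple (the post predates SymPy, and the specific transfer function is my own illustrative choice, not one from the original presentation):

```python
import sympy as sp

s = sp.symbols('s')

# An illustrative second-order transfer function (hypothetical example,
# not the one used in the author's presentations)
G = (3*s + 5) / (s**2 + 3*s + 2)

# Partial fraction expansion: the sort of manipulation that takes
# ~20 minutes by hand but is instantaneous symbolically
expanded = sp.apart(G, s)
print(expanded)  # 2/(s + 1) + 1/(s + 2)
```

The same one-liner pattern (`sp.apart`, `sp.series`, `sp.laplace_transform`, and friends) covers most of the brute-force manipulations listed above.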

**Hold off on the numbers**. With a traditional numeric tool of any sort, the first cardinal rule is to assign numerical values to your variables and parameters, and only then define the expression. This always seemed very odd to me, because the nature of engineering is that you do not know the values; that’s why you need models and analysis in the first place. So, by definition, numerical techniques force you into a guessing game from the start; the best numerical routines merely make the process of guessing and re-guessing more efficient. The symbolic approach, of course, is to leave the variables alone, work with fully expressive mathematical expressions, see if you can deduce something useful from the mathematical structure or transform the math into a more convenient form, and, when you’re finally ready, plug in the values and get your final answer. This to me was also more intuitive and consistent with how human beings work, and I was convinced that this is how computers really should behave as far as engineering math is concerned.
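The symbols-first, numbers-last workflow can be sketched in a few lines. Here a mass-spring-damper characteristic equation is analyzed structurally before any values are chosen; the system and the numbers are my own illustrative assumptions, and SymPy stands in for Maple’s symbolic engine:

```python
import sympy as sp

m, c, k, s = sp.symbols('m c k s', positive=True)

# Characteristic equation of a mass-spring-damper: m*s**2 + c*s + k = 0.
# We keep m, c, k symbolic -- no guessing game.
roots = sp.solve(m*s**2 + c*s + k, s)

# The structure tells us something before any numbers exist: the
# discriminant c**2 - 4*m*k separates the damping regimes.
# Critical damping is where it vanishes; solve for c symbolically.
c_crit = sp.solve(sp.Eq(c**2 - 4*m*k, 0), c)[0]
print(c_crit)  # 2*sqrt(k*m)

# Only now, at the very end, plug in values (illustrative numbers)
print(c_crit.subs({m: 2, k: 8}))  # 8
```

The useful result here is the formula `2*sqrt(k*m)` itself, valid for every mass and stiffness at once; the numeric substitution is the trivial last step rather than the starting point.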

**Fastest programming language known to human-kind** (for certain types of applications). Back then, Maple was principally a programming language. The main differences from other, more familiar languages at the time were that it was interpreted (i.e., easy to manipulate and debug), it was richly mathematical (i.e., I didn’t have to hunt down my own matrix inversion routine), and it was extremely forgiving (i.e., because nothing was locked to values, you could mix and match all sorts of weird and wonderful data structures). In the kind of work that I did at the time, this meant that I could develop fairly complex modeling applications in a fraction of the time it took my peers. Furthermore, because of the symbolic advantage, I could instantly add interesting analytical steps that I’d dig up in a paper, a book, or a late-night chat with my supervisor. Finally, to top it all off, my code was about a page in length while others had reams of fanfold dot matrix printer paper with thousands of lines of FORTRAN code.
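The “I didn’t have to hunt down my own matrix inversion routine” point is worth a tiny sketch. In a symbolic language, inverting a matrix of symbols is one call, with nothing locked to numeric values (again using SymPy as a modern stand-in for the Maple of that era, with an illustrative matrix of my own choosing):

```python
import sympy as sp

a, b = sp.symbols('a b')

# A symbolic 2x2 matrix -- entries are symbols, not numbers
M = sp.Matrix([[a, b], [b, a]])

# One call, no hand-rolled inversion routine; the result is an
# expression in a and b, valid wherever the determinant is nonzero
Minv = M.inv()

# Sanity check: the product simplifies to the identity
print(sp.simplify(Minv * M))
```

Contrast that with the hundreds of lines of FORTRAN a numeric-only workflow would have required just to get a (purely numeric) inverse.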

That was the late 1980s. Why did it take 20 years for the new dawn to finally break? My observation was that most engineers at the time seemed obsessively interested in the final solution only. The vast majority of engineers had only experienced computers as tools to solve for the final values. I recall countless discussions with engineers, trying to get them to pay attention to the time they were spending on setting up the problem rather than the fact that the solution in a natively symbolic system was slower than its numerical counterpart. “Why are you so interested in 1 second vs. 0.01 seconds when you just spent 5 weeks figuring out how to get this problem on the computer?!?” I’d ask rhetorically. In the end, it was a long, tough road. The formulation side of engineering computation was not clearly articulated or cohesive in any sense, whereas every engineer since 1960 understood practically everything they ever needed to know about numeric solution algorithms. And the reality was that, even with obvious shortcomings, numerical tools produced usable answers in predictable ways.

This new dawn of symbolic computation, from what I can see, is the result of some key factors. First, industry is waking up to the need for more rigorous development of mathematical models. Systems are becoming more complex, and we can no longer function well within the black-box framework of numeric computing. Second, several key formulation techniques evolved sufficiently to offer robust, predictable formulations for automatic derivation of model equations. The best example is linear graph-theoretic techniques for multibody systems (the approach MapleSim uses). This meant that you could clearly communicate and demonstrate consistent, efficient, and advantageous approaches to automated modeling for complex, industry-ready problems – hence saving weeks at the front of the process rather than microseconds at the tail end. Finally, there have been natural improvements within the core symbolic computation world – polynomial operations, code generation tools, expression simplification algorithms, and, yes, raw numeric speed – that complement the flexible symbolic tools directly applicable to a broad class of engineering problems. So the basic foundations that I stumbled upon 20 years ago were there from the beginning, but it took a lot of progress on multiple fronts, and non-trivial changes in society, to finally fit all of the pieces together.

I like Fred’s metaphor of the dawn, but the romantic in me prefers the image of courageous revolutionaries lurking in some computational jungle, keeping the symbolic spirit alive until one magic morning the forces of good finally prevail and all engineers reap the benefits of a truly integrated approach to modeling. :-) In the end, I’m not sure either metaphor is really better than the other, or even necessary. The truth is, engineers are deploying symbolic techniques not because of some epiphany but because today it makes good business sense … and it works.

*The Dark Ages? Receiving my PhD (1996) after convincing a group of engineers that symbolic computation might be a sensible way of designing an airplane … I guess they believed me :-) My new son Eric soaks it all in as he prepares his lifelong strategy to challenge every move I would make to get him interested in math and science.*