The British contributors to algebra belonging to the generation of Abel and Galois, on the other hand, set out to establish algebra as a "demonstrative science." These men were strongly affected by the fact that England's analytic contributions lagged behind those of the Continent. This was attributed to the superiority of "symbolic reasoning," or, more specifically, of the Leibnizian dy/dx notation over the fluxional dots still prevalent in England.
I find this rather funny, but also a bit of an object lesson. Essentially, the British blamed their analytic inferiority on bad notation. It wasn't that they lacked the machinery to do analysis; it was that their representation of it was difficult enough to work with that it impeded their progress. To put it another way, they blamed their lack of achievement on bad syntax. It was beaten into me in several math classes in college that good notation leads to insight; the chain rule example below makes the point concrete. I don't see why a programming language's syntax should be thought of any differently.
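To spell out the notation point (this example is mine, not from the passage above): in Leibniz notation, the chain rule for a composite function y(u(x)) reads

    dy/dx = (dy/du) * (du/dx)

and the notation itself suggests the result, as if the du's cancelled like ordinary fractions. That cancellation is a mnemonic rather than a proof, but it means the symbols carry part of the reasoning. Written with Newton's fluxional dots, where the derivative of y is just ẏ, the same rule offers no such visual hint. When the syntax mirrors the structure of the argument, the notation does some of the thinking for you.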