Matt's Mind

Sunday, July 03, 2005

Lisp

I listened to a podcast of Guido van Rossum, the architect of Python, talking at SD Forum recently. Although it made me vaguely irritable the way dynamic-typing people often condescendingly assume that "strong typing is for weak minds" (an actual quote from an audience member), it did resurrect my desire to learn a new language.

Not Python - despite Pythonistas' claims I've done enough Python to convince me that it isn't different enough from other OO languages to change the way I think about programming. Indeed, since Python didn't start out OO, it can be slightly obtuse in that area - I think I'd like to avoid writing "self.foo" all the time to reference data members, and avoid, too, the subtle bugs that come with forgetting the "self" (I realise that only weak minds would make this mistake though).
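A minimal sketch of the kind of slip I mean (a hypothetical `Counter` class, not from any real codebase):

```python
class Counter:
    def __init__(self):
        self.count = 0  # instance attribute: must always be qualified with self

    def increment_buggy(self):
        # Forgetting "self." just binds a new local variable;
        # the instance attribute is silently left untouched.
        count = self.count + 1
        return count

    def increment(self):
        self.count += 1
        return self.count

c = Counter()
c.increment_buggy()
print(c.count)  # still 0 - the buggy method never updated the attribute
c.increment()
print(c.count)  # now 1
```

No error, no warning: the buggy method simply doesn't do what it looks like it does.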

I recently revisited Paul Graham's writings on Lisp, and his claim that all languages strive to become Lisp (Guido touched on that too in comparing Python to Lisp), so I decided to have another go at that disturbing language (I'll elaborate on the "disturbing" part next). Luckily Paul Graham has a very good book available online called "On Lisp".

What I immediately found from the first few chapters of Graham's book is that, although I can recognise deep down the elegance of Lisp, I find I can't "think" in it at all. I had the same experience at Uni looking at examples in Lisp and Miranda - after some forehead-wrinkling and mental contortion they finally clicked in my head and I was able to see the mathematical elegance of the solution. The elegance can be somewhat lost, though, when "real" Lisp programmers avoid non-tail recursions for performance reasons and make the code that much less clean - see section 2.8 of the book if you care.
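For what it's worth, the same trade-off exists outside Lisp. Here's a rough Python analogue (my own sketch, not the book's code - and Python itself doesn't optimise tail calls, so this only shows the shape of the transformation):

```python
def length_natural(xs):
    # The mathematically clean definition: the length of a list is
    # one more than the length of its tail.
    if not xs:
        return 0
    return 1 + length_natural(xs[1:])

def length_tail(xs, acc=0):
    # The accumulator-passing version: the recursive call is the last
    # thing done (a tail call), which a Lisp compiler can turn into a
    # loop that runs in constant stack space.
    if not xs:
        return acc
    return length_tail(xs[1:], acc + 1)

print(length_natural([1, 2, 3]))  # 3
print(length_tail([1, 2, 3]))     # 3
```

The second version is the one the performance-minded Lisper writes, but the first is clearly the one that reads like the definition.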

The problem may be that I had already learnt to program before I went to Uni. I learnt snippets of BASIC on my friend's Commodore 64. Then I moved to MS GW-BASIC when my family finally got our own, IBM-compatible, PC. From there I moved to QuickBasic, which lived up to its name - I've never had such a speed improvement for "free" just by upgrading compilers; it was almost like magic. QuickBasic had examples of programming with, gasp, procedures with no line numbers. Actually, this procedure thing wasn't totally alien - I knew about gosub/return - but named procedures with actual parameters, wow, what a revelation when I "got" that idea.

From QuickBasic I moved to QuickC. C actually wasn't too much of a leap from procedural BASIC, except of course pointers and complex data structures were new concepts. But C had basically the same core forms as BASIC (pointers and malloc/free excepted). And it was a "real" programmer's language. For the first time I was able to develop (what I thought were) real applications. I eventually got into writing low-level 8086 assembler inside C to do things like write to the text-mode screen buffer, execute the terminate-and-stay-resident DOS call and access the then-new MS mouse driver APIs (and I can still remember that it was interrupt 33h).

My rambling back down memory lane is not intended to show that I can't stay on the point. It's intended to illustrate that I've had nearly 20 years (and I'm only 33) of procedural thinking. And while my current language of choice, Java, is a nice language, it wouldn't be at all alien to the me of 15 years ago.

So, I'm getting that same "hard of thinking" problem when reading the Lisp in Paul Graham's book. After some staring I can get the examples, but I wouldn't have the first clue how to write them. So it appears learning Lisp might be like my attempts to learn to touch-type, and might fail for the same reason: while learning, my productivity would be so painfully bad that I'd give up.

It does vaguely worry me that the facility with which I can write code one way might be paid for by the inability to learn new, better, ways of doing things. After all, I somehow suspect that we haven't achieved perfection with current languages and tools.

In spite of this, or perhaps because of it, my own belief is that the language itself has very little overall effect on productivity. Things like GC, protected memory, sophisticated tools, copious good-quality libraries and massive community support seem to be much more important factors than whether I can sort a list using a custom comparison function in two lines of code rather than six. And the key factor? The quality of the developer obviously makes the most difference.
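To make the two-lines-versus-six claim concrete, here's the sort of thing I mean, using Python's `sorted` (an illustrative example; the Java 1.4-era equivalent needs an anonymous Comparator class):

```python
words = ["pear", "fig", "banana", "kiwi"]

# Custom comparison in one expression: sort by length,
# breaking ties alphabetically.
by_length = sorted(words, key=lambda w: (len(w), w))
print(by_length)  # ['fig', 'kiwi', 'pear', 'banana']
```

Convenient, certainly - my point is just that this kind of convenience is a smaller productivity lever than the libraries, the tools and the person doing the typing.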

But of course I may be wrong. How can I really know whether I'd be a much better developer if I could think in Lisp? But I suspect the human mind at my age (i.e. post adolescence :/ ) has room for only one major language, programming or otherwise.

I'll keep at the Lisp though.

4 Comments:

  • Some interesting points Matt, but I'm not sure I agree with the language one.

    When I code, I always wonder, "why? why am I doing it this way? why has the world evolved to this?"... separate compilation and text-file source is a classic example. Why? Well, we needed a separate compile / link because the PDP had limited memory. We needed separate files because, again, there was limited memory to build a parse tree. We use text files because they're easy and standard.

    But as we talk about larger programs and XP/RAD-style development, we want tools to build things faster. Ok, so now we have Java, which has a lot of good points to it. Fundamentally, our code base is now stored in a structured database (really) and the build is a multi-pass, single-phase compile and link.

    We have automatic syntax and semantics checking being done transparently in the background.

    But, I'm still writing a *LOT* of code and having to wade through a *LOT* of classes (when using patterns). The next step has to be a language that supports the transparent implementation of patterns far better than we do now.

    More importantly, I find OO-style programming only works for small self-contained (or small-company) project libraries. We have things like the gjt, etc, but the 'reuse' panacea has not been reached yet. Library interdependence is still a nightmare, as is duplication of code.

    One of the key benefits of Java was IMHO the VM / API system. ie. not the Java language, but the platform-independent API you were compiling against - this is the same reason .NET is so cool. It means as a developer I only have to write my application once against this guaranteed API. Whether I run it inside a JVM or statically compile it, who cares? It's like a more standardised version of POSIX.

    However, we are now seeing fragmentation in areas of Java, and this is a Bad Thing (TM) (eg. Eclipse SWT vs Swing, beans, etc).

    Hmm, I know I'm rambling here, but I think the following things are needed for the 'holy grail' of language:

    * a Java/.NET-like VM / API
    * choice of interpreted or static compile (ie. for small systems)
    * native language support for patterns (eg. singleton, adapters, etc) - reduces class explosion
    * more community-based development for enhancing the 'standard' libraries
    * continue with the great IDEs (NetBeans / Eclipse)
    * native support for loosely typed variables (eg. like BASIC used to do it)

    Imperative / OO languages are not the total panacea, but they are far easier to understand in 95% of instances than functional implementations.

    And the big hooha over scripting languages is just because good IDEs are too expensive or too hard to learn for most people.

    Anyway, just my $7.22 :)

    By Anonymous scharman, at 3:42 pm  

  • I think we agree that the important value in Java/.NET/whatever is not the language, it's the platform.

    I've spent significant time programming in a number of procedural languages (BASIC, C, C++, Ada, Pascal, Perl, Java), some OO, some not, and I can't think of anything about the language that's made a significant difference in productivity between them. The main reasons I feel much more productive in Java than in any other language so far are (a) the platform (GC, threads, memory protection, strong typing), (b) the tools (eg Eclipse) and (c) the massive availability of good open-source libraries.

    I'd disagree that reuse isn't getting more significant. Compared to five years ago, we're using a lot more 3rd-party software. There are still cases of conflict, but most projects I'm working on right now use four or more 3rd-party libraries, whereas the ones I worked on when I'd just got out of Uni used only libraries supplied with the system.

    A classic example: when I wanted to add RSS/Atom parsing to a project recently, it literally took two hours to find a library and hook it in. It used the same XML parser as everything else (via JAXP), the same net libraries (java.net), the same logging APIs, etc. Zero impedance. Unlike a decade ago, when I'd have had to integrate its build system, threading library, network access, etc. - I probably wouldn't have bothered.

    SWT may cause some fragmentation, but it's also getting Sun to get its shit together with Swing, something that just wasn't happening previously. Swing is a train wreck right now, and SWT is a chance at resolving the problem one way or another (kill it or fix it).

    People think that there is a panacea awaiting us if we develop a better language. I hope they're right, but can't see any evidence that any current language is an order of magnitude more productive.

    By Blogger Matthew Phillips, at 11:53 am  

  • Hey Matt, just read your Lisp post from July :-) How's it going? Maybe you want to try Ruby or Scala :-)

    BTW I think a lot of the ppl talking about strong/weak/static/duck/etc typing are mis-using & misunderstanding the terms. The real difference is _static_ vs _dynamic_ typing. Eg Python _is_ strongly typed, it's just not statically typed.

    There's another issue too which is whether names for things (ie variables) have associated types or whether it's just the values themselves that are typed.
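    A quick Python sketch of both points (my example, not michael's):

```python
# Strong typing: Python refuses to mix unrelated types implicitly;
# the error just arrives at runtime rather than compile time.
try:
    result = "1" + 1
except TypeError:
    result = "caught TypeError"
print(result)  # caught TypeError

# Dynamic typing: the name x carries no declared type - only values
# are typed - so x can be rebound to a value of a different type.
x = 1         # x refers to an int
x = "one"     # now x refers to a str; nothing objects
print(type(x).__name__)  # str
```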

    Cheers

    michael

    By Anonymous Anonymous, at 11:37 am  

  • Hi Michael. I haven't followed up any further on the Lisp. I think it may end up being like touch-typing after all :/ However, my current language fixation du jour, Haskell, looks more hopeful. I really like the fact that it manages to painlessly incorporate super-strict typing into a very elegant functional language.

    I agree that people often argue at cross-purposes about dynamic typing. I don't think there'd be too many serious software engineers who think that no typing at all is a good idea. My central objection to dynamic (runtime-enforced) typing (a la Python) is that, while the types are indeed there, they're not explained very well to the compiler or to the reader of the code. If you're going to use types, why not make them explicit and let the IDE help you with the API and any misunderstandings you might have about it?

    Especially for newbies, the ambiguity caused by having no explicit type info in the API can be infuriating. I cited an example I came across in a previous blog: my immediate question was "what's an fp?" In Java, in Eclipse, I'd just control-click the class name and jump to the source, show a type hierarchy, discover where else that class is used, etc. In Python you're supposed to run some code and inspect the contents or something. Python may be easier to write, but it can be much harder to read in consequence.

    Haskell, by specifying types in a fairly precise and flexible way, seems to have its cake and eat it too. We'll see if I have the concentration span to take it any further :)

    Ruby looks nice, haven't heard of Scala. Have you used either in anger before?

    By Blogger Matthew Phillips, at 7:13 pm  

Post a Comment

<< Home