Clean vs. Ada

Adrian Hey ahey@iee.org
Sun, 12 Apr 1998 20:51:12 +0100 (BST)


On Sat 11 Apr, Jim Hassett wrote:
> Is a Clean compiler really much faster than an Ada compiler?  The Clean
> home page describes the speed of compilers as generally in the thousands
> of lines per minute.  Here are some Ada compilation speeds that have
> been reported to me:

Well, it's been a very long time since I used Ada (the last time was in 1991),
if my memory is correct. I was using Alsys (I believe they're now called
Thompson somethingorother), version <I've forgotten>, on a 33MHz 386 PC running
DOS version <I've forgotten>. I suppose you would expect that to be pretty
slow by today's standards. The trouble is, it was also extremely slow even in
those days (compared with, say, Borland Turbo Pascal, or even MS Pascal).
I would hazard a guess at about 20 times slower on a line-for-line basis.

I always thought that this might be because it was DOD certified, and
since Ada is a language designed to produce slow but bullet-proof code
for defence applications, maybe the DOD demands that it be compiled
in a very safe, but laborious, manner. On the other hand, I suppose it
could be because it was just a cr*p compiler. Maybe the compiler itself
was written in Ada and compiled on a DOD certified compiler. Who knows.

If you want to compare the speed of Ada compilers with Clean, I don't
think a straight line-for-line comparison is entirely fair, for the
following reasons:

1- If we forget optimisation, compared with Clean (or Haskell, or even ML),
   compiling an Ada program is pretty straightforward. Type checking is
   easy and you can just about translate Ada statements into executable
   code sequences on a statement-by-statement basis. No doubt a lot more
   work than this has to be done if you want the code to be 'optimised'.

2- In contrast, a Clean (or Haskell or ML) compiler has an awful lot
   more work to do. Even after type checking/inference is finished there's
   still a lot of juggling and transformation necessary (finding free
   variables, 'lambda lifting' etc.) before the program is in a form
   remotely suitable for code generation; there's a small sketch of what
   I mean after this list. And that's still un-optimised.
   (The trouble with functional languages is that it's not at all clear
   what 'optimised' actually means. If you want to get cleverer & cleverer
   you can go on optimising for ever, that's why they're so interesting.)

3- I would guess that the 20,000-line Ada programs you've benchmarked
   could probably be re-written in about 2,000 lines (maybe less) of Clean
   or Haskell.
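
By 'lambda lifting' in point 2, I mean roughly this: a local lambda that
grabs a free variable from its surroundings gets turned into a top-level
function which takes that variable as an extra argument, which is much
closer to something a code generator can deal with. A toy Haskell sketch
(my own, names made up, nothing to do with the real Clean compiler
internals):

    -- Before lifting: the lambda captures the free variable 'n' from
    -- the enclosing scope.
    addAll :: Int -> [Int] -> [Int]
    addAll n xs = map (\x -> x + n) xs

    -- After lifting: the lambda has become the top-level function 'addN',
    -- and its free variable 'n' is now passed as an extra argument.
    addN :: Int -> Int -> Int
    addN n x = x + n

    addAll' :: Int -> [Int] -> [Int]
    addAll' n xs = map (addN n) xs

A compiler that takes this route has to do something like it to every
local function in the program before it even starts generating code.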

In short, you should compare compilation times for programs which do
similar things, not programs which contain similar numbers of lines of code.
Compilers for functional languages have to do a lot of work, but on far
fewer lines of code.
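
To give an idea of what I mean by 'do similar things', here's the stock
conciseness example (mine, not one of Jim's benchmarks): a complete
polymorphic sort in a handful of lines of Haskell.

    -- A complete sort, using the usual list-comprehension quicksort.
    quicksort :: Ord a => [a] -> [a]
    quicksort []     = []
    quicksort (p:xs) = quicksort [x | x <- xs, x < p]
                       ++ [p]
                       ++ quicksort [x | x <- xs, x >= p]

An Ada routine doing the same job, with its declarations, loops and swaps,
would typically be many times longer, so a line-for-line comparison
flatters the Ada compiler.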

Recently Rinus said that I/O was the major bottleneck in Clean compilation.
If this is true then presumably an Ada compiler running on the same
platform would have the same problem. So we might find that Clean compares
favourably with the latest Ada compilers, even on a line-for-line basis.
Unfortunately, I can't back this up with any figures; perhaps someone
else can. (I suppose we should bear in mind that Clean would have a lot
more output to generate than Ada in this case.)

Personally, I think the Clean compiler is amazingly fast compared
to some other ML compilers I've tried. This is especially remarkable
considering it was written in Clean (or so I believe; no doubt somebody
will correct me if I'm wrong). If nothing else, the Clean team have
finally managed to kill the lie that functional programming languages
are inherently slow. (I never believed that one anyway, but I've met
an awful lot of sceptics who did.)

Mark Jones's Haskell system (Hugs) is pretty fast too, but that isn't
generating native code and it was written in C. So that's cheating :-)

Phew! Another essay completed and submitted for marking. Sorry all my
messages are so long. I promise I'll go silent soon.

Regards
-- 
Adrian Hey