Macros considered harmful

Discussion of Common Lisp
methusala
Posts: 35
Joined: Fri Oct 03, 2008 6:35 pm

Macros considered harmful

Post by methusala » Thu Nov 05, 2009 8:01 pm

What are your thoughts on this paper that considers macros and lisp compilers to be undesirable?

2.1 Myth 1: Lisp Needs a Compiler
This is in fact the most significant myth. If you listen to Lisp discussion groups, the
compiler plays a central role. You might get the impression that it is almost a synonym
for the execution environment. People worry about what the compiler does to their code,
and how effective it is. If your Lisp program appears to be slow, you are supposed to get
a better compiler.
The idea of an interpreted Lisp is regarded as an old misconception. A modern Lisp needs
a compiler; the interpreter is just a useful add-on, and mainly an interactive debugging
aid. It is too slow and bloated for executing production level programs.
We believe that the opposite is true. For one thing (and not just from a philosophical
point of view) a compiled Lisp program is no longer Lisp at all. It breaks the fundamental
rule of "formal equivalence of code and data". The resulting code does not consist of S-Expressions, and cannot be handled by Lisp. The source language (Lisp) was transformed
to another language (machine code), with inevitable incompatibilities between different
virtual machines.
Practically, a compiler complicates the whole system. Features like multiple binding
strategies, typed variables and macros were introduced to satisfy the needs of compilers.
The system gets bloated, because it also has to support the interpreter and thus two
rather different architectures.
But is it worth the effort? Sure, there is some gain in raw execution speed, and compiler
construction is interesting for academic work. But we claim that in daily life a well-designed "interpreter" can often outperform a compiled system.
You understand that we are not really talking about "interpretation". A Lisp system
immediately converts all input to internal pointer structures called "S-Expressions". True
"interpretation" would deal with one-dimensional character codes, considerably slowing
down the execution process. Lisp, however, "evaluates" the S-Expressions by quickly
following these pointer structures. There are no searches or lookups involved, so nothing
is really "interpreted". But out of habit we'll stick to that term.
A Lisp program as an S-Expression forms a tree of executable nodes. The code in these
nodes is typically written in optimized C or assembly, so the task of the interpreter is
simply to pass control from one node to the other. Because many of those built-in lisp
functions are very powerful and do a lot of processing, most of the time is spent in the
nodes. The tree itself functions as a kind of glue.
A Lisp compiler will remove some of that glue, and replace some nodes with primitive or flow functionality directly with machine code. But because most of the time is spent in built-in functions anyway, the improvements will not be as dramatic as for example in a Java byte code compiler, where each node (a byte code) has just a comparatively primitive functionality.

from:
http://software-lab.de/radical.pdf

ramarren
Posts: 613
Joined: Sun Jun 29, 2008 4:02 am
Location: Warsaw, Poland
Contact:

Re: Macros considered harmful

Post by ramarren » Thu Nov 05, 2009 11:19 pm

methusala wrote:What are your thoughts on this paper that considers macros and lisp compilers to be undesirable?
I skimmed the article, and, to be honest, it is ignorant and/or insane. The author doesn't seem to know what 'compilation' even means. I mean, CLisp is used as an example of a compiler. The quoted fragment alone shows total ignorance of type inference and optimization. It seems to be pushing Pico Lisp, for some reason, into domains where it is not useful.

dmitry_vk
Posts: 96
Joined: Sat Jun 28, 2008 8:01 am
Location: Russia, Kazan
Contact:

Re: Macros considered harmful

Post by dmitry_vk » Fri Nov 06, 2009 12:11 pm

Ramarren wrote: I skimmed the article, and, to be honest, it is ignorant and/or insane. The author doesn't seem to know what 'compilation' even means. I mean, CLisp is used as an example of a compiler. The quoted fragment alone shows total ignorance of type inference and optimization. It seems to be pushing Pico Lisp, for some reason, into domains where it is not useful.
I agree with your opinion.
CLisp is an example of a slow interpreter - when performance is important, the comparison should be made to SBCL and Clozure.
The article contains a lot of misguided opinions - e.g. about compilers vs. interpreters, data types, dynamic binding, and macros.

gugamilare
Posts: 406
Joined: Sat Mar 07, 2009 6:17 pm
Location: Brazil
Contact:

Re: Macros considered harmful

Post by gugamilare » Fri Nov 06, 2009 7:41 pm

I think the person who wrote this has never heard of benchmarks :)
He must have a wrong idea of the flow of a program, or he has never had to write a speed-critical program.

Bah, never mind. It reminds me of those engineers who invent their own math theory, claiming that standard mathematical theory is completely wrong and acting like they know more math than mathematicians. You just have to ignore them. If they are happy doing that, let 'em be :)

methusala
Posts: 35
Joined: Fri Oct 03, 2008 6:35 pm

Re: Macros considered harmful

Post by methusala » Fri Nov 06, 2009 9:36 pm

Thanks for all of the feedback, it's valuable to hear what experienced lispers from around the world have to say on this. Now please allow me to play the "devil's advocate."

Comparing the speed of PicoLisp to the virtual-machine-based CLisp was not as good a test as comparing it to the machine-code-compiling SBCL. However, CLisp is in widespread use and probably isn't the slowest Lisp. So the test at least supports the claim that the interpreted PicoLisp is comparable in speed to a commonly used byte-code-compiled Lisp system.

But I think the bigger point the article made is that PicoLisp is truer to the core purpose and value of the Lisp programming language: a language that is supposed to always and everywhere be directly programmable in itself, a physical embodiment of the lambda calculus. If large parts of the core of a Lisp system are concerned with adapting to the hardware it's running on, isn't that Lisp system taking a step away from the lambda calculus and a step towards languages like C? PicoLisp has only three types: lists, integers and symbols.

To put it another way, if the strength and core of Lisp is the lambda calculus, shouldn't Lisp programmers be asking computer makers to adapt to us, rather than us adapting to their computers? Computers get faster all the time; why not make use of that speed by making as little of Lisp as possible dependent on underlying C code? We could still have the type system for those who want it, but instead of implementing it in C, implement it as a Lisp program based on lists, integers and symbols. Such an uncompromising Lisp system would also be the most easily adaptable to the many new, faster and ever more advanced computer hardware systems created each year, which is where the future is.

ramarren
Posts: 613
Joined: Sun Jun 29, 2008 4:02 am
Location: Warsaw, Poland
Contact:

Re: Macros considered harmful

Post by ramarren » Sat Nov 07, 2009 1:51 am

methusala wrote:PicoLisp has only three types: lists, integers and symbols.
That is not a good thing. I would prefer to program directly in assembly rather than with a limitation like that. For me, Lisp is about freedom, not being hobbled by academic concepts like "lambda calculus", which while useful in certain contexts, cannot and should not be a basis for all programming.

dmitry_vk
Posts: 96
Joined: Sat Jun 28, 2008 8:01 am
Location: Russia, Kazan
Contact:

Re: Macros considered harmful

Post by dmitry_vk » Sat Nov 07, 2009 7:04 am

methusala wrote: But I think the bigger point the article made is that PicoLisp is truer to the core purpose and value of the Lisp programming language: a language that is supposed to always and everywhere be directly programmable in itself, a physical embodiment of the lambda calculus. If large parts of the core of a Lisp system are concerned with adapting to the hardware it's running on, isn't that Lisp system taking a step away from the lambda calculus and a step towards languages like C? PicoLisp has only three types: lists, integers and symbols.

To put it another way, if the strength and core of Lisp is the lambda calculus, shouldn't Lisp programmers be asking computer makers to adapt to us, rather than us adapting to their computers? Computers get faster all the time; why not make use of that speed by making as little of Lisp as possible dependent on underlying C code? We could still have the type system for those who want it, but instead of implementing it in C, implement it as a Lisp program based on lists, integers and symbols. Such an uncompromising Lisp system would also be the most easily adaptable to the many new, faster and ever more advanced computer hardware systems created each year, which is where the future is.
The "lambda calculus" is not about a lack of types (actually, the lambda calculus says nothing about types). The lambda calculus is about combining functions to create new functions in different ways. Data types are not a step towards the computer hardware; they are actually a step away from it (in hardware, you have no types at all).
Computer languages can be compared on the following bases: 1) primitive elements; 2) means of combining primitive elements; 3) means of abstraction (e.g., assigning a name to some combination of primitives so the combination can be used as a primitive element). In Lisp, you have a rich set of primitive types, means of combining them, and means of abstracting them. In PicoLisp, you have a very small set of primitive types and no means to combine or abstract them. This means you are severely limited in what tasks you can solve. For example, you can't make a good image-processing library, because you simply can't represent images efficiently with lists, integers and symbols.
Also, nowhere in the lambda calculus will you hear that functions are represented as lists. You will see functions, lambdas and parameters. Functions are not semantically lists. The whole "a function is a list" idea is just a consequence of trying to fit everything into the "everything is a list" dogma. A function is not a list. An image is not a list. A complex number is not a list. A GUI widget is not a list. A thread is not a list. A file is not a list. A database is not a list.
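The three bases above can be sketched in Common Lisp; a minimal illustration (the `image` structure name and its slots are my own example, not from any particular library):

```lisp
;; Abstraction in Common Lisp: DEFSTRUCT names a combination of
;; primitives (two fixnums and a vector) so it can then be used
;; as a new primitive type. `image` is purely illustrative.
(defstruct image
  (width  0   :type fixnum)
  (height 0   :type fixnum)
  (pixels #() :type simple-vector))

;; The combined type is constructed and passed around like any
;; built-in type, with efficient typed storage for the pixels.
(defvar *img*
  (make-image :width 2 :height 2
              :pixels (make-array 4 :initial-element 0)))
```

This is exactly what PicoLisp's three fixed types rule out: there is no way to name a new aggregate with its own representation.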

findinglisp
Posts: 447
Joined: Sat Jun 28, 2008 7:49 am
Location: Austin, TX
Contact:

Re: Macros considered harmful

Post by findinglisp » Sat Nov 07, 2009 7:44 am

dmitry_vk wrote: The "lambda calculus" is not about a lack of types (actually, the lambda calculus says nothing about types). The lambda calculus is about combining functions to create new functions in different ways. Data types are not a step towards the computer hardware; they are actually a step away from it (in hardware, you have no types at all).
Computer languages can be compared on the following bases: 1) primitive elements; 2) means of combining primitive elements; 3) means of abstraction (e.g., assigning a name to some combination of primitives so the combination can be used as a primitive element). In Lisp, you have a rich set of primitive types, means of combining them, and means of abstracting them. In PicoLisp, you have a very small set of primitive types and no means to combine or abstract them. This means you are severely limited in what tasks you can solve. For example, you can't make a good image-processing library, because you simply can't represent images efficiently with lists, integers and symbols.
Also, nowhere in the lambda calculus will you hear that functions are represented as lists. You will see functions, lambdas and parameters. Functions are not semantically lists. The whole "a function is a list" idea is just a consequence of trying to fit everything into the "everything is a list" dogma. A function is not a list. An image is not a list. A complex number is not a list. A GUI widget is not a list. A thread is not a list. A file is not a list. A database is not a list.
I won't disagree with your fundamental point (that PicoLisp is certainly more primitive in terms of data structures than I would like), but I will challenge you on the idea that PicoLisp doesn't allow you to combine primitive elements or abstract them. As long as you have a number data type, some sort of aggregate data type (e.g. a list), and some way to tell them apart (ATOM would do), you can frankly do anything you can do in Common Lisp. It would be slower, more labor intensive, and less efficient at a machine-level, but it's ultimately no less expressive.

For instance, look at all the various data structures that Emacs Lisp cobbles up out of raw lists. Things like key maps and such. In a CL world, you'd probably create CLOS objects for all those, but there is really no reason other than efficiency and convenience to do so (yea, I know, nothing but that stuff... :roll: ). The fact that Emacs does as much as it does with the list type demonstrates that it can be done.
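As a hedged sketch of that Emacs-Lisp style in Common Lisp (the `point` tag and its accessors are my own illustration, not from Emacs):

```lisp
;; Emulating a record type with nothing but lists and symbols:
;; a tagged list plays the role a CLOS class would. This is the
;; kind of structure Emacs Lisp cobbles up out of raw lists.
(defun make-point (x y) (list 'point x y))
(defun point-p (obj)
  (and (consp obj) (eq (first obj) 'point)))
(defun point-x (p) (second p))
(defun point-y (p) (third p))

(defvar *p* (make-point 3 4))
```

Slower and more manual than DEFCLASS, but, as argued above, no less expressive.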

Another for-instance from a different programming language: Erlang doesn't even have real characters or strings. Characters are simply integers and strings are simply lists of integers. As a result, string processing is slower in Erlang than it might be in other languages with an array-based string representation, but it still can be done.
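In Common Lisp terms, the Erlang-style representation amounts to something like this sketch (the function names are my own):

```lisp
;; An Erlang-style string: just a list of integer character codes.
(defun string->codes (s)
  (map 'list #'char-code s))

(defun codes->string (codes)
  (map 'string #'code-char codes))

;; String processing then becomes ordinary list processing.
(defun upcase-codes (codes)
  (mapcar (lambda (c) (char-code (char-upcase (code-char c))))
          codes))
```

Everything works; it is just slower than operating on an array-based string directly.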

But, in the end, your overall point is valid: why be so primitive when you don't have to? I also disagree with the PicoLisp author, who seems to hold minimalism as the greatest trait of a language. While it might be right to accuse CL of being a huge language with a lot of baggage, PicoLisp over-rotates in the other direction, IMO. In fact, I would call Scheme minimalist and PicoLisp downright primitive.

I also question the PicoLisp author's benchmarking. I think he relies heavily on routines coded in C for the programs that form the basis of his comparisons. PicoLisp's speed seems attributable to doing as little in PicoLisp as possible, simply dispatching from the interpreter to C routines as often as possible and doing most of the heavy lifting in C. While it's fine to develop a language that way, comparing your interpreter versus other CL implementations that do "real work" in Lisp itself, and saying that you're just as fast, seems like comparing apples and artichokes. At worst it's intellectually dishonest.

All that said, I have nothing against PicoLisp. There are times for interpreters, too, and PicoLisp's author is right to point that out, even if I think his arguments become a bit specious. PicoLisp hasn't chosen a crazy implementation path, IMO. It is what it is, and if it works for its author or anybody else, that's great. But as for me, I think I'll stick with CL or Scheme.
Cheers, Dave
Slowly but surely the world is finding Lisp. http://www.findinglisp.com/blog/

gugamilare
Posts: 406
Joined: Sat Mar 07, 2009 6:17 pm
Location: Brazil
Contact:

Re: Macros considered harmful

Post by gugamilare » Sat Nov 07, 2009 9:14 am

methusala wrote:Thanks for all of the feedback, it's valuable to hear what experienced lispers from around the world have to say on this. Now please allow me to play the "devil's advocate."

Comparing the speed of PicoLisp to the virtual-machine-based CLisp was not as good a test as comparing it to the machine-code-compiling SBCL. However, CLisp is in widespread use and probably isn't the slowest Lisp. So the test at least supports the claim that the interpreted PicoLisp is comparable in speed to a commonly used byte-code-compiled Lisp system.
I think you are wrong when you say "CLisp is not the slowest Lisp". Well, it might not be the slowest Lisp, but I am almost sure it is the slowest Common Lisp. But, yes, I agree it is commonly used and sufficiently fast for many uses, and I don't know of any reason to prefer CLisp over an equivalent S-expression interpreter (though not PicoLisp, which has neither floats nor arrays, nor many useful libraries).

But, on the other hand, try creating a complete and powerful application like Firefox in a simple language like PicoLisp, or even in CLisp. I assure you it will not be fast enough for the many people who are not using high-end quad-core PCs (not to mention older PCs, netbooks, smartphones, ...). And I believe there will always be applications at this edge. Also remember recent games (which always require speed and are intended for common users; I believe that situation won't change for many years).
To put it another way, if the strength and core of lisp is the lambda calculus, shouldn't Lisp programmers be asking computer makers to adapt to us rather us adapting to their computers?
The lambda calculus is purely a theory; its role is to prove things, not to make things work, so I mostly disagree with this. If we were true to the lambda calculus, we wouldn't be allowed to manipulate anything but functions (see the SKI combinator calculus for a (crazy) example showing that this is possible).

We must adapt to the computers, because otherwise processor designers would be constrained by our demands and wouldn't be able to produce fast, low-power processors.

I am not saying that interpreters don't have their space, nor that it is a small space (just look at PHP, for example). But the article's author is basically saying that "compilers aren't needed by common programmers in general" and that "interpreters can be fast enough for any program", which is wrong. This particular sentence, for instance:
[...] most of the time is spent in the nodes.
is complete nonsense. This is clearly not the case even for some simple algorithms (especially loops). Having to transform code into machine code is indeed a limiting factor for programmers, but not as much of one as he tries to make us believe.
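For example, a tight loop like the following (a sketch, not a benchmark) spends nearly all its time in trivial operations, so per-node interpretive dispatch dominates unless a compiler reduces it to machine arithmetic:

```lisp
;; Nearly every operation here is trivial (an add, a compare),
;; so a pure interpreter pays dispatch overhead on every node,
;; while a native-code compiler like SBCL's turns the whole loop
;; into a few machine instructions.
(defun sum-below (n)
  (declare (type fixnum n) (optimize (speed 3)))
  (let ((acc 0))
    (declare (type fixnum acc))
    (dotimes (i n acc)
      (incf acc i))))
```

There is no powerful built-in here for the interpreter's "nodes" to hide in, which is exactly the case the quoted paper glosses over.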

abu
Posts: 7
Joined: Thu Feb 18, 2010 3:51 am
Location: Augsburg, Germany
Contact:

Re: Macros considered harmful

Post by abu » Mon Apr 12, 2010 9:44 am

dmitry_vk wrote: Clisp is an example of a slow interpreter - when performance is important, comparison should be made to sbcl and clozure.
Article contains a lot of misguided opinions - e.g. the compiler vs interpreter, data types, dynamic binding, macros.
Sorry for being so late with my response, but I didn't know about this thread.

Anyway, I did some tests with SBCL, and posted a kind of response in http://picolisp.com/5000/-2-F.html.

I'm also open for further discussions.

Cheers,
- Alex
