views: 483, answers: 9

I have read that most languages are becoming more and more like Lisp, adopting features that Lisp has had for a long time. I was wondering: what are the features, old or new, that Lisp does not have? By Lisp I mean the most common dialects like Common Lisp and Scheme.

A: 

Decent syntax. (Someone had to say it.) It may be simple/uniform/homoiconic/macro-able/etc, but as a human, I just loathe looking at it :)

Brian
There was an attempt at the so-called "M-expressions", but it just never caught on. And of course there's Dylan.
Frank Shearar
Anyone who has spent more than 10 minutes dealing with Lisp has learned to ignore the syntax. Just like with every other language, programming or otherwise.
jrockway
If "indecent syntax" was the price I had to pay to write 1/10th as much code, I'd be willing to pay that. But perhaps I learned Lisp when I was too young, because I never understood why I should loathe it. You put the first paren before the word `defun` instead of after it, and use () instead of {}, and otherwise it's pretty much just the same as most other languages. I don't understand why that ties people in knots so.
Ken
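(To make the syntax comparison above concrete, a minimal sketch; the function name add is just an illustration:)

;; In Lisp the opening paren comes before the operator, and () replaces {}.
(defun add (a b)
  (+ a b))

;; A C-family equivalent: int add(int a, int b) { return a + b; }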
It ties people in knots because they read articles about "how lisp is so awesome111", but don't want to use it. So they say, "yeah, I'm sure it's awesome, but I just can't get over the syntax", when they actually mean, "I like PHP and new things scare me." Personally, I like Lisp's syntax, and I also like other languages' syntax.
jrockway
(Just for the record, I taught a class in Common Lisp (http://www.cc.gatech.edu/classes/cs2360_98_summer/ - even had fun homework - see homework 4), but even after that I still can't get past the darn parens.)
Brian
The syntax is not so bad as long as you are using a decent editor that shows you matching parentheses. Once you have that, it's just like working with any other language, but without a good editor I agree the code is impossible to work with.
Justin Ethier
jrockway: I'll be the first to admit that new things scare me. Fortunately Lisp predated PHP by a few decades, so at least I'm safe on that count. :-)
Ken
@jrockway I've spent more than 10 minutes with perl and php, and the horrible irregular syntax didn't go away.
Pete Kirkham
Tell someone who cares?
jrockway
@jrockway your statement that syntax is not significant is false in my experience. If you don't care whether or not your statements are borne out in practice, what value do they have?
Pete Kirkham
What value does anything have? What is the meaning of life?
jrockway
A: 

It's missing a great IDE

Mel Gerats
Uh, Emacs? CL + Emacs blows Java + Eclipse out of the water.
jrockway
Emacs + SLIME feels like a miracle from the POV of C-style development. Still, Emacs could use a decent text editor. :p
guns
@jrockway Seriously? Code editor != IDE
Frank Krueger
You don't know much about Emacs.
jrockway
@Frank Krueger: SLIME is the part you're probably missing.
Frank Shearar
When I first started Emacs, I was looking for the help, found something that looked like an agent-based help system like Clippy, and it tried to psychoanalyse me.
Pete Kirkham
+1  A: 
  • It can be harder than in more popular languages to find good libraries.
  • It is not purely functional like Haskell
sfg
Arguably purely functional languages lack the mutability feature. I can't see how you would add a feature to lisp to make it purely functional; you'd have to remove stuff instead.
Pete Kirkham
If purely functional is what you like, take a look at Clojure, the most important new member of the Lisp family.
Dan Weinreb
+9  A: 

This question has been asked a million times, but here goes. Common Lisp was created at a time when humans were considered cheap and machines were considered expensive. Common Lisp made things easier for humans at the expense of making it harder for computers. Lisp machines were expensive; PCs with DOS were cheap. This was not good for its popularity; it was better to get a few more humans making mistakes in less expressive languages than to buy a better computer.

Fast forward 30 years, and it turns out that this isn't true. Humans are very, very expensive (and in very short supply; try hiring a programmer), and computers are very, very cheap. Cheaper than dirt, even. What today's world needs is exactly what Common Lisp offered; if Lisp were invented now, it would become very popular. Since it's 30-year-old (plus!) technology, though, nobody thought to look at it, and instead created their own languages with similar concepts. Those are the ones you're using today. (Java + garbage collection is one of the big innovations. For years, GC was looked down upon for being "too slow", but of course, a little research and now it's faster than managing your own memory. And easier for humans, too. How times change...)

jrockway
One correction: Lisp is (a little more than) 50 years old.
Eli Barzilay
This is why I said "Common Lisp", which is only really recognizable going back to the mid-80s. (Lisp machine lisp, maclisp, elisp, etc.)
jrockway
Well, the big advantages of the whole lisp family are arguably things that precede Common Lisp. In fact, Common Lisp is not too significant as far as innovations go -- its biggest contribution was in combining forces of previous dialects.
Eli Barzilay
CLOS is quite significant as far as innovations go. It was also the first object-oriented language to get a standard. Also, Common Lisp for the first time made it possible to write larger software in a dynamic language over a wide range of machines, by specifying a language that had optimization features (like compiler directives, type declarations, and more) built in. All the Lisps before had either semantic problems (compiler and interpreter behaving differently, etc.) or were designed with reduced capabilities (Scheme: just the core specified).
Rainer Joswig
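(As a concrete illustration of the "optimization features built in" point above, a minimal sketch; dot-product is just a made-up example:)

;; Standard CL lets you declare argument types and optimization levels inline,
;; which a compiler can use to generate specialized code.
(defun dot-product (a b)
  (declare (type (simple-array double-float (*)) a b)
           (optimize (speed 3) (safety 1)))
  (loop for x of-type double-float across a
        for y of-type double-float across b
        sum (* x y) of-type double-float))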
Programmers cheap in the 70s/80s? Which planet? Programmers expensive today? Again, which planet? That Java has GC does not make it similar to Lisp. 'If Lisp were invented today, it would be very popular.' I don't think so. This has been discussed for decades and I think you got most of it wrong.
Rainer Joswig
This answer is so wrong on so many levels that it does not even deserve a rebuttal, but here goes. Programmers cheap decades ago (@rainer_joswig got it right)? If that's the case, why provide GC, lexical scope (and hence closures), CLOS, etc.? It's 50+ year old technology and still unmatched by anything short of Haskell and the ML family.
Stephen Eilert
This is incredibly wrong. For example, Lisp has had automatic deallocation with garbage collection since 1959. It took a long, long time for there to be "mainstream" languages that worked this way. Until then, expensive programmers had to waste time figuring out exactly when to deallocate things, a hard job when data structures aren't very simple. There are indeed some things that we'd do differently these days, that would make things simpler, but your overall characterization is utterly wrong.
Dan Weinreb
I don't really understand all the hateful rebuttals. I am not opining on anything, really, just observing what happened. "Industry" decided on C and cheap commodity hardware. This was not good for quality, and bugs in C (or rather, C programmers) brought down the entire Internet a number of times. The reality is that Lisp was ahead of its time, any negatives got stuck in people's minds, and it never really recovered, even though the technical issues were solved. This is why you are writing Java today and not Common Lisp.
jrockway
+4  A: 
  • Pass-by-reference (C++/C#)
  • String interpolation (Perl/Ruby)
  • Nice infix syntax (though it's not clear that it's worth it) (Python)
  • Monadic 'iteration' construct which can be overloaded for other uses (Haskell/C#/F#/Scala)
  • Static typing (though it's not clear that it's worth it) (many languages)
  • Type inference (not in the standard at least) (Caml and many others)
  • Abstract Data Types (Haskell/F#/Caml)
  • Pattern matching (Haskell/F#/Caml/Scala/others)
  • Backtracking (though it's not clear that it's worth it) (Prolog)
  • ad-hoc polymorphism (see Andrew Myers' answer)
  • immutable data structures (many languages)
  • lazy evaluation (Haskell)

(Please add to this list, I have marked it community wiki.)

This just refers to the Common Lisp and Scheme standards, because particular implementations have added a lot of these features independently. In fact, the question is kind of mistaken. It's so easy to add features to Lisp that it's better to have a core language without many features. That way, people can customize their language to perfectly fit their needs.

Of course, some implementations package the core Lisp with a bunch of these features as libraries. At least for Scheme, PLT Scheme provides all of the above features*, mostly as libraries. I don't know of an equivalent for Common Lisp, but there may be one.

*Maybe not infix syntax? I'm not sure, I never looked for it.
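
(To make the "add it as a library" point concrete, a minimal sketch of layering one of the listed features, lazy evaluation, on top of Common Lisp; the names delay/force are illustrative, not a standard CL API:)

;; A promise caches a thunk and, once forced, its value.
(defstruct promise thunk value forced-p)

(defmacro delay (expr)
  "Wrap EXPR in a promise without evaluating it yet."
  `(make-promise :thunk (lambda () ,expr)))

(defun force (p)
  "Evaluate a promise at most once and cache the result."
  (unless (promise-forced-p p)
    (setf (promise-value p) (funcall (promise-thunk p))
          (promise-forced-p p) t))
  (promise-value p))

;; (force (delay (expensive-computation)))  ; runs the computation on demand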

Nathan Sanders
I'm pretty sure I've seen Common Lisp libraries for at least 3 of these. :-)
Ken
This is misleading because "standards" don't mean much here... To make it proper it should be clarified that the biggest advantage of lisp (and scheme) is its ability to change and adapt -- and gain many of these features. For example, PLT Scheme is a lisp/scheme dialect and it has each and every one of these features.
Eli Barzilay
@Ken: Yes, I've implemented a good number of these things badly, myself. Are they features of the language? No. They're libraries. @Eli: I will try to find a way to clarify this--it's a good point because the OP's question is kind of mistaken.
Nathan Sanders
Correct me if I'm wrong but isn't ad-hoc polymorphism implemented in CLOS using defgeneric/defmethod with specialization? I'm just learning Lisp but from what I've read CLOS sounds like it supplies all the same stuff I get out of C++ class hierarchies but in a different/more flexible manner.
Andrew Myers
@Andrew: Could be. I hadn't thought through all the possibilities. Can you give an example? (Or just edit it out yourself if you are confident enough of your answer.) Really this answer needs separate CL/Scheme sections.
Nathan Sanders
@Nathan I'm not that confident in my answer but I'll add an example of what I'm thinking as a response. I would like to add it as part of this comment since it's not really a response to the question but I don't know how to format code properly in comments.
Andrew Myers
If a language allows something to be implemented in a library as if it was part of the language, is there any way in which it's worse that it's not officially "in the language"? That's a big deal in C/Java/etc, where the libraries and the language are hugely distinct, but the main advantage of Lisp seems to be that they're not.
Ken
@Ken: That's why the question is mistaken in the first place. I still answered the question, though, working with the OP's assumptions. If you can integrate your comment into my post above, please do--I think your answer would help the OP's understanding more than the list he asked for.
Nathan Sanders
+1  A: 

This is in response to the discussion in comments under Nathan Sanders' reply. This is a bit much for a comment, so I'm adding it here. I hope this isn't violating Stack Overflow etiquette.

Ad-hoc polymorphism is defined as different implementations based on specified types. In Common Lisp, using generic functions, you can define something like the following, which gives you exactly that.

;This is unnecessary and created implicitly if not defined.  
;It can be explicitly provided to define an interface.
(defgeneric what-am-i? (thing))

;Provide implementation that works for any type.
(defmethod what-am-i? (thing)
  (format t "My value is ~a~%" thing))

;Specialize on thing being an integer.
(defmethod what-am-i? ((thing integer))
  (format t "I am an integer!~%")
  (call-next-method))

;Specialize on thing being a string.
(defmethod what-am-i? ((thing string))
  (format t "I am a string!~%")
  (call-next-method))


CL-USER> (what-am-i? 25)
I am an integer!
My value is 25
NIL
CL-USER> (what-am-i? "Andrew")
I am a string!
My value is Andrew
NIL
Andrew Myers
One difference, IIRC, is that C++-style ad-hoc polymorphism uses compile-time types for dispatching, while CLOS-style generic functions use runtime types for dispatching. Granted, I can't think of any way in which generic functions are worse, but I'm not sure I'd say CL has ad-hoc polymorphism. (I might say "it has generic functions, which are *better* than ad-hoc polymorphism".)
Ken
You're correct about compile time vs. runtime dispatching. I realized last night as I was going to bed that this is not complete ad-hoc polymorphism either. For ad-hoc polymorphism you also need to be allowed to vary the number of arguments, not just their types. So in C++ you can have one overload take no arguments and another take 2; I'm not sure how to do this in CL. I think that it would be possible, but a lot of work de-structuring the arguments yourself.
Andrew Myers
C++ definitely does have runtime dispatch. Without that, it would not be an object-oriented language in any meaningful sense. C++ overloads, which you're discussing, are something else. Look for "overriding" as opposed to "overloading".
Dan Weinreb
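(A minimal sketch related to the arity discussion above, assuming &optional parameters are acceptable; CLOS requires all methods of a generic function to share a congruent lambda list, so varying the argument count is expressed in the generic function itself rather than as separate overloads. describe-thing is just an illustrative name:)

(defgeneric describe-thing (thing &optional verbose))

(defmethod describe-thing ((thing integer) &optional verbose)
  (format t "integer: ~a~%" thing)
  (when verbose (format t "  (binary: ~b)~%" thing)))

(defmethod describe-thing ((thing string) &optional verbose)
  (format t "string: ~s~%" thing)
  (when verbose (format t "  (length: ~a)~%" (length thing))))

;; (describe-thing 42)      ; one argument
;; (describe-thing "x" t)   ; two arguments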
+1  A: 
  • Whole-program transformations. (It would be just like macros, but for everything. You could use it to implement declarative language features.) Equivalently, the ability to write add-ons to the compiler. (At least, Scheme is missing this. CL may not be.)
  • Built-in theorem assistant / proof checker for proving assertions about your program.

Of course, I don't know of any other language that has these, so I don't think there's much competition in terms of features.

Noah Lavine
Lisp macros really ARE add-ons to the compiler; they are true language extensions. I'm not quite sure what more you are asking for.
Dan Weinreb
+2  A: 

For Common Lisp, I think the following features would be worth adding to a future standard, in the ridiculously unlikely hypothetical situation that another standard is produced. All of these are things that are provided by pretty much every actively maintained CL implementation in subtly incompatible ways, or exist in widely used and portable libraries, so having a standard would provide significant benefits to users while not making life unduly difficult for implementors.

  • Some features for working with an underlying OS, like invoking other programs or handling command line arguments. Every implementation of CL I've used has something like this, and all of them are pretty similar.

  • Underlying macros or special forms for BACKQUOTE, UNQUOTE and UNQUOTE-SPLICING.

  • The meta-object protocol for CLOS.

  • A protocol for user-defined LOOP clauses. There are some other ways LOOP could be enhanced that probably wouldn't be too painful, either, like clauses to bind multiple values, or iterate over a generic sequence (instead of requiring different clauses for LISTs and VECTORs).

  • A system-definition facility that integrates with PROVIDE and REQUIRE, while undeprecating PROVIDE and REQUIRE.

  • Better and more extensible stream facilities, allowing users to define their own stream classes. This might be a bit more painful because there are two competing proposals out there, Gray streams and "simple streams", both of which are implemented by some CL implementations.

  • Better support for "environments", as described in CLTL2.

  • A declaration for merging tail calls and a description of the situations where calls that look like tail calls aren't (because of UNWIND-PROTECT forms, DYNAMIC-EXTENT declarations, special variable bindings, et c.).

  • Undeprecate REMOVE-IF-NOT and friends. Eliminate the :TEST-NOT keyword argument and SET.

  • Weak references and weak hash tables.

  • User-provided hash-table tests.

  • PARSE-FLOAT. Currently, if you want to turn a string into a floating-point number, you either have to use READ (which may do all sorts of things you don't want) or roll your own parsing function. This is silly. (A sketch of the reader-based workaround appears after this list.)
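
(To illustrate the PARSE-FLOAT item above, a minimal sketch of the reader-based workaround; read-float is just an illustrative name, not a standard or library function:)

;; Lean on the Lisp reader, but guard against its side effects.
(defun read-float (string)
  (let ((*read-eval* nil))   ; disable #. evaluation while reading
    (let ((value (read-from-string string)))
      (unless (floatp value)
        (error "~S does not read as a float" string))
      value)))

;; (read-float "3.14") => 3.14
;; The other option is a hand-rolled parser or a third-party library;
;; nothing is in the standard.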

Here are some more ambitious features that I still think would be worthwhile.

  • A protocol for defining sequence classes that will work with the standard generic sequence functions (like MAP, REMOVE and friends). Adding immutable strings and conses alongside their mutable kin might be nice, too.

  • Provide a richer set of associative array/"map" data types. Right now we have ad-hoc stuff built out of conses (alists and plists) and hash-tables, but no balanced binary trees. Provide generic sequence functions to work with these.

  • Fix DEFCONSTANT so it does something less useless.

  • Better control of the reader. It's a very powerful tool, but it has to be used very carefully to avoid doing things like interning new symbols. Also, it would be nice if there were better ways to manage readtables and custom reader syntaxes.

  • A read syntax for "raw strings", similar to what Python offers. (A sketch of one possible spelling appears after this list.)

  • Some more options for CLOS classes and slots, allowing for more optimizations and better performance. Some examples are "primary" classes (where you can only have one "primary class" in a class's list of superclasses), "sealed" generic functions (so you can't add more methods to them, allowing the compiler to make a lot more assumptions about them) and slots that are guaranteed to be bound.

  • Thread support. Most implementations either support SMP now or will support it in the near future.

  • Nail down more of the pathname behavior. There are a lot of gratuitously annoying incompatibilities between implementations, like CLISP's insistence on signaling an error when you use PROBE-FILE on a directory, or indeed the fact that there's no standard function that tells you whether a pathname is the name of a directory or not.

  • Support for network sockets.

  • A common foreign function interface. It would be unavoidably lowest-common-denominator, but I think having something you could portably rely upon would be a real advantage even if using some of the cooler things some implementations provide would still be relegated to the realm of extensions.
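
(To illustrate the "raw strings" item above, a minimal sketch using the standard readtable machinery; the #"..." spelling chosen here is non-standard and purely illustrative:)

;; Read characters verbatim after #" until the next ", with no escape handling.
(defun read-raw-string (stream sub-char arg)
  (declare (ignore sub-char arg))
  (with-output-to-string (out)
    (loop for c = (read-char stream t nil t)
          until (char= c #\")
          do (write-char c out))))

(set-dispatch-macro-character #\# #\" #'read-raw-string)

;; #"C:\new\raw" now reads as the literal string C:\new\raw (backslashes are not escapes).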

Pillsy
MANY of the things you mention here are available in libraries so well-established as to be as much "part of the language" as are the Java standard libraries. bordeaux-threads, usocket, and cffi are some of them. There's good discussion about the future of Lisp on the web site I set up for the International Lisp Conference 2009 (http://ilc2009.scheming.org/). It discusses many of the issues brought up here.
Dan Weinreb
@Dan Weinreb: Right, and if there were to be a CLtL3 or an ANSI CL1x standardization effort, things that have existing, widely-used and easily-available implementations would be obvious additions to that new standard.
Pillsy
A: 

You are asking the wrong question. The language with the most features isn't the best. A language needs a goal.

We could add all of this and more:

* Pass-by-reference (C++/C#)
* String interpolation (Perl/Ruby)
* Nice infix syntax (though it's not clear that it's worth it) (Python)
* Monadic 'iteration' construct which can be overloaded for other uses (Haskell/C#/F#/Scala)
* Static typing (though it's not clear that it's worth it) (many languages)
* Type inference (not in the standard at least) (Caml and many others)
* Abstract Data Types (Haskell/F#/Caml)
* Pattern matching (Haskell/F#/Caml/Scala/others)
* Backtracking (though it's not clear that it's worth it) (Prolog)
* ad-hoc polymorphism (see Andrew Myers' answer)
* immutable data structures (many languages)
* lazy evaluation (Haskell)

but that would not make a good language. A language is not functional if you use call-by-reference.

If you look at the new Lisp, Clojure: some of these are implemented, but others that CL has are not, and that makes for a good language.

Clojure, for example, added some:

* ad-hoc polymorphism
* lazy evaluation
* immutable data structures
* Type inference (most dynamic languages have compilers that do that)

My answer is:

Scheme should stay as it is. CL could add some ideas to the standard if they ever make a new one.

It's Lisp; most of this can be added with libraries.

nickik