Takes Tai-Chi for a couple of months, thinks he is qualified to talk about 'qi'.
Takes university course that touches on Java, dabbles in Lisp, thinks he is qualified to discuss 'failings' of Java
Begs the question: if Lisp is so amazing and Java is so bad, why is Lisp a failure in the marketplace?
----
Remember that even Lisp's biggest recent (as in, in the ~17 years since Java was released) success (Paul Graham selling his store-thing to Yahoo) was quietly re-written in Java a couple of years later.
If your answer to the success/failure question is that people who use Java are stupid-heads and people who use Lisp are wonderful geniuses, then may I suggest that you haven't solved the dilemma, you've just multiplied the entities by pushing it back a layer.
In any case, nobody cares about Java vs Lisp anyway. All the 'cool kids' in that space (e.g. functional programming on the JVM) are into stuff like Scala and Clojure. To ignore those makes the argument kind of pointless... it is like having an argument between a Dodgers fan and a Patriots fan. The Dodgers fan says the Dodgers are better because they hit more home runs than the Patriots, and the Patriots fan says the Patriots are better because they score more touchdowns.
It's just fundamentally stupid. Whereas if you compared Lisp and Scala, then at least you'd be talking about teams playing the same sport. Yes, it's still going to be as useful as watching a Patriots fan and a Dallas Cowboys fan argue about which is better, but at least the argument isn't based on fundamentally stupid premises in that case.
As someone who has programmed professionally in Java, Scala, and Lisp (including Clojure), I might be qualified to assert that your sports analogy is off the mark. All of these programming languages can be effectively used to solve many of the same problems, though each language may excel at addressing certain kinds of problems better.
For me, programming in a dialect of Lisp is just more fun than programming in the other languages. Java is not fun at all, in comparison. And that's largely due to the fact that it can be difficult in Java to write reusable code. Consequently, you either have to repeat yourself a lot or tie yourself up in knots to figure out a way to not repeat yourself. Repeating oneself is tedious and boring.
On the other hand, static type checking can save you a lot of time debugging. But it can also slow you down. It's probably a net win overall for most problems, but for some problems, it is too restrictive. I find Scala to be a happy medium for most problems, but I often switch to Python for small programs, where the freedom from static typing can make development much easier and faster.
I think the problem with Lisp is that most people just don't "get it". I'm not sure why they don't, but perhaps it's a fact that I just have to accept. On the other hand, I firmly believe that the state of software engineering would be greatly improved if most programmers did get Lisp. I can't say the same thing about Java, however. In Java there isn't really much to get that you wouldn't get by learning any other OO programming language.
Came across this while researching Turing's proof that a Turing machine can compute anything that is computable†. It's Church, comparing his and Turing's approaches:
> computability by a Turing machine ... has the advantage of making the identification with effectiveness in the ordinary (not explicitly defined) sense evident immediately. (1937a: 43.) http://plato.stanford.edu/entries/church-turing/
If we see imperative languages (e.g. Java) as descending from Turing machines, and functional languages (e.g. Lisp) as descending from Church's lambda-definability, the same observation seems to apply. It's not that Turing's approach is better, just easier to "get".
† "computable" meaning "can be computed" (not "computable with a Turing machine"). Turing doesn't prove this formally - it's an intuitive appeal, and no exceptions have been found. See section 9, page 249 (page 20 of the pdf) https://docs.google.com/viewer?url=http://www.cs.virginia.ed...
Your analogy doesn't make sense, Lisp and Java are both programming languages. Why not compare them? Also, learning Common Lisp can help immensely if you plan on doing any hardcore Clojure meta programming.
Of course it makes sense, Baseball and Football are both sports, why not compare them? Because as different sports it isn't really relevant. Are you going to say that cricket is better than tennis because they can hit the ball further? It's a stupid argument. Just because two things belong in the same broad category doesn't mean that you can meaningfully compare them.
It is especially specious to bash Java for not being a functional language, because it doesn't try to do that. Java is heavily OO, not functional. So let's say that functional languages are 'better'. Let's just take that as a given. Why then would you single out Java and not mention any of the other languages that are also not functional? Why not bash Smalltalk, or Objective-C?
To single out Java simply underscores the irrelevance of the comparison.
Hence, rather than compare teams from two different sports, you should compare teams from the same sport. Likewise, if you want to make the claim that Common Lisp is the ultimate functional programming language, you should compare it to another functional programming language, not an OO language, or something like Prolog or COBOL.
Why then would you single out Java and not mention any of the other languages that are also not functional? Why not bash Smalltalk, or Objective-C?
There's nothing specious in an argument that isn't as general as it might be. If I soundly argue that red space ships are faster than blue cars, it's still a sound argument, even if I might have also successfully argued that all space ships are faster than all cars.
In any case, Java is often singled out because it is egregiously bad in quite a few ways, but arguments that claim that Lisp excels in certain ways compared to other languages, would typically apply against Smalltalk and Objective-C. (Though Smalltalk is dynamically typed, so that might be a putative virtue that Java doesn't have.)
A lot of people are under the misapprehension that Lisp is a failure in the marketplace, when in fact many industrial applications use Lisp. Start with the following list: http://wiki.alu.org/Industry%20Application
No, the better way to rebut this claim of ineffectiveness in the market would have been to say "which Lisp?"
The fact is that Lisp has spawned a great number of descendants. So to compare modern Java to some ancient version of Lisp from the 70s would be no more relevant than comparing modern Java to C++98.
But even that has its problems for the Lisp fans, because if you aggregate all the work being done in various dialects of Lisp, it is still no more than a pimple on the ass of the donkey that is Java.
----
Your rebuttal fails specifically because the claim is not that work cannot be done in Lisp, but rather that for various reasons hardly any work is actually being done in Lisp. You're trying to disprove the strawman that no work is being done in Lisp. Congratulations, your tilting at windmills was successful, but that was not the claim being made.
----
The question of why Lisp is not more popular is actually really interesting. Let me advance one theory - that Lisp programs devolve too easily into domain specific languages, and that impedes the transmission of ideas to other programmers. E.g. if I write some awesome Lisp code to solve problem X, it is hard for you to figure out what my code does until you understand my mini-language, and hard to understand the mini-language until you see what the code does - a chicken-and-egg problem.
Whereas languages which remain mired in the foul stench of their own syntax and keywords are easier to understand because if I know the language (Java or COBOL or VB or C++ or whatever) then I am already halfway towards understanding the solution because the language it is expressed in is relatively static.
This feels like a lot of blog posts I write: the main content is probably there, in the author's head, but not enough of it was put on the paper. What we read instead is a slice of what I have no doubt is a fantastic essay.
I have recently begun a love affair with Racket and for a reason I think that the author was ultimately trying to get at: the core syntax of any Lisp is a minimalist tree and the semantics are lambda calculus (and abstract algebra to help motivate types).
A REPL for Java could conceivably be written; AI is language agnostic; and plenty of languages let you eval generated code; but Lisps are a window into computation itself, and it's very addictive.
Admittedly I skimmed the blog post but I saw a complaint against the "Hello World", not how to print out a string. I agree the "System.out.print" thing is such a non-issue. If it really bothers you, make a class in your package that's a wrapper around it so you can call Utl.print() or U.o() or however short you want to make it. Meanwhile the rest of us will get back to programming. In the few years I've done a lot of Java development it's never bothered me enough to even make a vim shortcut for it.
However, the "Hello World" criticism does make sense to me in this sense. There are at least 18 separate facts you need to know to fully understand a Java "Hello World" program (including how to compile and run it). For [insert almost any other language (not C++)] there is considerably less. (Python is 4 facts.) If your goal is to teach someone who has never programmed before a new language, Java seems like a poor choice in that respect.
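For concreteness, here is the canonical program with the concepts it presupposes marked in comments (the exact count of "facts" is the parent's, not mine; the grouping below is just one plausible tally):

```java
// The canonical Java "Hello World". Each comment marks concepts a
// complete beginner must take on faith before the program makes sense.
public class HelloWorld {                     // classes; the 'public' modifier;
                                              // the file-name-matches-class rule
    public static void main(String[] args) {  // static members; method signatures;
                                              // 'void'; arrays; the String type
        System.out.println("Hello, world!");  // the System class; its static 'out'
                                              // field; method calls; string literals
    }
}
// Plus, outside the source file: compiling with javac and running with java.
```

Compare a one-line `print("Hello, world!")` in Python, where essentially the only concepts needed are "function call" and "string".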
In any case, Clojure is where it's at right now man. It's pure awesome-sauce on its own and you get all of Java's goodies as a bonus.
Seriously. I've been doing Java for years and it had been so long since I wrote public static void main etc. that I had to look it up. If you're writing mostly classes to be run straight-up from the command line, you're doing it wrong.
Actually, running the class straight from the command line can be a really cool way of doing ad-hoc unit testing.
You can have a main method that instantiates an instance of the class, populates it with whatever it needs (including mocks if you're into that sort of thing) and then exercises the various methods of the object.
It's like IoC but without all the bread and circuses.
NB: it's been about 8-10 years since I did anything like that.
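The pattern being described - a main method doubling as an ad-hoc test harness, runnable straight from the command line - might look something like this sketch (ShoppingCart and its methods are hypothetical, invented purely for illustration):

```java
import java.util.ArrayList;
import java.util.List;

// A class whose main method serves as an ad-hoc test harness:
// instantiate the object, populate it, exercise its methods,
// and report pass/fail. Run directly with `java ShoppingCart`.
public class ShoppingCart {
    private final List<Double> prices = new ArrayList<>();

    public void add(double price) { prices.add(price); }

    public double total() {
        double sum = 0.0;
        for (double p : prices) sum += p;
        return sum;
    }

    // Ad-hoc test entry point, not the application's real main.
    public static void main(String[] args) {
        ShoppingCart cart = new ShoppingCart();  // instantiate the class under test
        cart.add(9.99);                          // populate with sample data
        cart.add(0.01);
        boolean ok = Math.abs(cart.total() - 10.0) < 1e-9;
        System.out.println(ok ? "PASS" : "FAIL: total=" + cart.total());
    }
}
```

No framework, no runner, no configuration - which is presumably the "without all the bread and circuses" part.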
Running from the command line is good for something that can run headless but you want to see immediate success/failure type messages.
The other main use I had for programs run from the command line was small 'one-off' file format conversion programs. Once you've done half a dozen of these it's relatively easy to just bang out the ~100 lines of code to do this (including proper exception handling).
Would it be faster to write it in Perl? Well, if the regex is sufficiently simple then yes, and if it is moderately complex you'll still use far fewer keystrokes - but getting it to the point that it is correct, that's easier in Java. (Assuming you optimise for readability, something Perl... struggles... with.)
Speaking for CLI utilities (not apps in general), startup time is a killer. Python, Ruby, Perl etc are so much faster (despite Java improvements) since they began as "scripting" languages. One solution is to reuse a warmed-up JVM (even faster if also reusing already-loaded classes):
1. use a Java REPL, like BeanShell or Groovy (or write one).
2. call a Java server from the CLI
I did (2) for a while, but not worth the saving of a second or so.
BTW: I find "you're doing it wrong" so unreasoned that it's best ignored
And if you use Spring, then the start up time is even worse! I do wish the JVM had an option that would make it start up instantly, even if it meant that the program would run slower. I wonder if the slow startup time has to do with the JVM having to dig through .jar files to find stuff. I'm also surprised that there isn't a way to cache the machine code generated by the JIT so that it doesn't have to be regenerated on every run. It seems very strange to me that Java was invented by a Unix company, and yet it seems rather Unix-hostile.
For the stuff that I'm working on, the 1 second startup time is no biggie though. One CLI thing I wrote, for instance, crunched on genomics data for 3 minutes before finishing. By tuning tight loops and the like, I was able to get that down to 15 seconds, but even so, the extra second there is no biggie. Except for when the user uses the "--help" option. That extra second is tedious.
Yeah, I'm puzzled by them not caching, e.g. they could copy an image of the JVM in its starting state.
Maybe the "unix hostility" is due to platform agnosticism ("run everywhere"), servant of two (or three) masters. e.g. issues in file libraries and AWT. But py/rb/perl manage this OK... so idk.
BTW: bad practice to split up functionality (and loses many benefits of Java), but a performance solution would be a script that handles the "--help", and for anything else starts the JVM.
I've actually used that hack for "--help", but I mostly use command line parsing libraries (e.g., JCommander) that generate the usage message for you. The users will just have to put up with the slight delay....
Yeah, it's a pain to set up and maintain. Bizarre idea: a source transformer that knows about a CLI parsing library, to detect your usage of it and create the help option in a scripting language - and also provide the packaging (for the scripting language to start the JVM). Like Google's GWT compiling a Java subset to JavaScript.
Seems a bit of overkill, but I like the idea of a transformation operating above the level of one language...
I suppose the logical conclusion is compilation of arbitrary Java source to a scripting language, according to the performance characteristics required. It's nice that it would retain semantics and be transparent to the user, while rebalancing performance trade-offs.
Yes, Scala is a JVM language, and all JVM languages that I am aware of have this start-up delay. One of the advantages of the JVM is that many different languages can interoperate easily, but if you do source-to-source translation to a non-JVM language, then the origin languages (e.g., Java) that have this extra support become preferred, and thus discourage the adoption of the better languages. E.g., GWT prevents us from completely migrating to Scala.
It seems as if it would be better to have a JVM byte code to Python byte code translator, or something like that, if such a thing is possible. But I should think that it would be even better and easier to just have a version of the JVM (or an option on the JVM) that starts up much faster.
If you're happy with just Java, doesn't BeanShell start up more quickly? (I don't know, because I've never used it.)
Yes; or a JVM that starts up faster, and split the code between them in the same way.
I'm assuming there's a fundamental trade-off between startup delay and average speed, or we'd have had an instant option long ago. Um, if the problem is loading the jars etc., this mightn't work.
But one could still detect byte-code calls with hard-coded string arguments (for "--help" and text) and compile that to python.
Perhaps I should have clarified and added some more nuance. If the vast majority of the classes you write are, by design, intended to be directly run from the command line as their primary mode of execution, another language may be better-suited to what, to me, would sound like shell operations. Groovy, Python, or others may prove more concise and maintainable in such circumstance.
Actually, writing code that can be called directly from a prompt is the essence of using a REPL. The fact that Java, and OO programming generally, makes that so hard is a major weakness.
"A year and a half ago, a dear friend of mine send me this link."
pg's "Beating the Averages" is not known widely enough. We must promote it more. We owe it to our friends who toil under inhumane conditions and work with sadistic technologies.
BTW, pg, I'm taking my "Hackers and Painters" to PyCon 2012. Will you have time to write a dedication? That is, if I get the book back from the colleague I lent it to. ;-)
Well if the author took the same care in determining that Lisp is better than Java that he did in "confirming the existence" of real life magical powers through Tai-Chi then why wouldn't I listen?
I understand that Java is verbose and does not have the same "instant" feedback loop that a language with a fully supported REPL has. But I do not agree with the assertion that Lisp would be a good substitute for Java as the core instruction language in a University course. If people are having trouble getting their head around Java, I think they would struggle with Lisp. Java, in its simplest form, can be written in a very imperative manner. Most people think sequentially, and if you are getting started with programming, writing a simple little program in Java, IMO, is easier than something like Lisp.
I am not saying Java is better, just that it has less cognitive overhead than Lisp.
I studied both Java and Haskell at University and found it much easier to get going with Java than with Haskell (I hacked lots of QBasic - I am getting on a bit - when I was a kid). Actually, I think a language like Ruby or Python would be much better suited for an undergraduate / dual major than Lisp or Java!
The authors of How to Design Programs use Lisp (well, Racket/Scheme) and they are some of the most careful and thoughtful CS educators, who appear to be thinking deeply about CS pedagogy (at least, this is the impression I get from reading some of their papers and some of HtDP). So I wouldn't be so quick to write off Lisp as an intro language over Java.
"I studied both Java and Haskell at University and found it much easier to get going with Java than with Haskell (I hacked lots of QBasic - I am getting on a bit - when I was a kid)."
I think Java is arguably closer to Basic than it is to Haskell, so this isn't really a very fair or representative data point. (I don't claim that non-programming undergrads taking CS 101 would be more successful if started with Haskell instead of Java, although I believe that's not unheard of).
Thanks for your comment. I will take a look at the book you recommend.
To be honest, it has been a while since I was an undergraduate!
On a related note - how easy is it to transition from having first learnt how to program in Lisp (or a functional language) to programming in Java or C++? I have seen lots of blog posts about doing it the other way around (Java to Lisp). I would imagine that also requires a cognitive leap ..
Lisp (Scheme) was the introductory language taught for CS and EE majors at MIT when I started as an undergraduate, and I found it to be the most intuitive language I had ever seen.
More recently MIT switched to Python.
Either one is a much better choice than Java. I would never have gone into CS if MIT had started us off on Java.
The claim has been made that it is easier to move from a formal education in maths to functional programming.
E.g. if you've been dealing with things like f(x)->g(x) in math class for a couple of years, then a functional language is a more natural fit.
This is why the iconic learning-Lisp examples are calculating numbers like the Fibonacci sequence or factorials, rather than something that is actually useful in the real world. (Which is not to say that Fibonacci numbers and factorials aren't useful, but rather that those problems are well and truly solved already.)
The first thing that anyone who doesn't know Java complains about is writing a main method, which no one in the Java world does 99% of the time. It is probably the first thing you have to do in a Hello World Java tutorial, but it has little to do with everyday Java development.
There are some poor design choices in Java, like no operator overloading, the mix of primitives and their boxed equivalents, the lack of closures or first-class functions, confusing equals semantics, generics verbosity, etc. But the main method is just a moot point.
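Two of those items - the primitive/boxed mix and the confusing equals semantics - can be demonstrated in a few lines (BoxingGotcha is just an illustrative class name; the -128..127 Integer cache is the behavior guaranteed by the language spec, though a JVM may cache more):

```java
// Comparing boxed integers with == tests reference identity, not value,
// and autoboxing goes through Integer.valueOf, which caches -128..127.
// So whether == "works" depends on the magnitude of the numbers.
public class BoxingGotcha {
    public static void main(String[] args) {
        Integer a = 127, b = 127;
        System.out.println(a == b);       // true: both refer to the cached Integer
        Integer c = 128, d = 128;
        System.out.println(c == d);       // false: two distinct boxed objects
        System.out.println(c.equals(d));  // true: equals() compares values
    }
}
```

Code that happens to be tested only with small numbers can pass `==` comparisons for years and then break in production - exactly the kind of trap a language shouldn't set.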
Also it has nothing to do with the superiority of Lisp over Java. Lisp has a different philosophy: it's a functional language, while languages like Java, C, Ruby, and Python are all imperative (see http://en.wikipedia.org/wiki/Functional_programming#Comparis...). I think a better article would be about the virtues of each style and when to use which.
One can program in Lisp in an imperative manner very easily and likewise one can program in Python in a functional manner very easily. The big difference between Lisp and Python (or Ruby) is not the functional nature of Lisp, but rather its homoiconicity.
Re Java, I write CLI programs in Java all the time. It's fine for that, other than the annoying 1 second startup time.
It is too bad your example does not really convey anything interesting about the differences between clisp and Java.
The only thing I get from your essay is that Common Lisp has a REPL, which is sad because clisp is actually fun and interesting, and has a lot of VERY DIFFERENT things going for it.
Size of source code has never been a measure of the power of the language.
I agree a "hello world" program is a good place to start learning but a terrible place to start comparing. To be fair, though, to really show the benefits of Lisp style programming you'd have to write something quite a bit longer. Long enough, in fact, you'd have trouble enticing people to read it.
>Lispers like to pick on Java. It makes sense, because Common Lisp has everything that Java lacks.
But Java has a lot of things Common Lisp lacks, too. A GUI package. Network programming. Cryptography. Yes, you get modules for all those things. But having them in the core language means you know they're always available no matter where your code is running.
I like Lisp, even with its painful syntax and poorly named internals (Seriously. "cdr"? "endp"? Was there a Great Keystroke Shortage in the '60s?). But it doesn't have the ecosystem for actually producing a product to anywhere near the extent Java does. There's a reason Java is used so much more widely than Lisp.
In terms of syntax, I like what Rich Hickey did with Clojure. You still have all the Lisp functionality but it's a lot easier to read.
There are reasons Java is used widely, just like there were reasons Cobol was used widely.
Now, car and cdr are just vestigially named, that is true. On the other hand, things like endp actually make sense--the p is a convention telling you that endp returns a boolean. It's short for predicate. I personally like Scheme's convention to name procedures like that end?, but endp also makes sense.
Also, the gui package in Java, for example, isn't really part of the language; it's part of the standard library. I don't see why that is a particularly big advantage, especially because you are not guaranteed to have it everywhere your code will run--for example, I don't think you have access to all of those libraries on Java ME.
Ease of reading is also a matter of habit. I actually find lisp code easier to read than most Java code, largely because the Java code tends to have a lot of random cruft (anonymous inner classes implementing a single method, for example).
> Seriously. "cdr"? "endp"? Was there a Great Keystroke Shortage in the '60s?
Well...yes? When jaw-dropping storage is a 1 MB hard disk, things like function names really do make a difference, especially when they're typed so frequently. For the same reason, C89 only guarantees six significant characters in external identifiers (and hence the C naming convention of things like "strdup" rather than "duplicateString").
Sure, because you can fit six characters in 32 bits. But still... those constraints are no longer with us, and there's a limited amount of historical cruft people learning a language today should have to deal with.
"(Seriously. "cdr"? "endp"? Was there a Great Keystroke Shortage in the '60s?)"
To be fair, CL has "first" and "rest" as well as "car" and "cdr". And as already pointed out, "endp" makes perfect sense if you want to establish a convention for predicates but don't want to allow symbols to contain special characters like question marks.
(I know that CL allows symbols to contain ?s if you surround them with ||, but I don't think anyone seriously uses something like (defun |end?| (x) ...).)
Java has good libraries and some interesting concurrency primitives, and it's a major improvement over C++ (damning with faint praise) but the language is so verbose and adverse to intelligent programming techniques that I find it nearly unusable.
I have a theory about Java. In the '90s, it was not unheard-of for programmer productivity (individually and for the group) to be measured in LoC. Java's great advantage is that it's easy to be "productive" (100+ LoC/day) without accomplishing or thinking very much. An actually productive day can leave behind an eye-popping 750 LoC.
Reading that code later is another story.
One language that pleasantly surprised me was Scala. I expected to hate it based on the way it was presented to me, and found most online tutorials pretty weak, but the language is really cool. I definitely think it's "an acceptable ML", and the fact that it can leverage everything good about Java (including the libraries) is nice.
>I have a theory about Java. In the '90s, it was not unheard-of for programmer productivity (individually and for the group) to be measured in LoC.
Nah, even by the mid '80s KLoC was discredited as a metric for programmers.
Java had a couple of big advantages over what had come before. For one thing, it promised, and mostly delivered, a true cross-platform ecosystem. It had a GC and the VM was written with hooks for IDE-based development. And it had a standard GUI package. C++ had none of that.
Java is verbose, but it's verbose because you have to be explicit with everything, which makes the code a lot easier to follow if you didn't write it. Poorly written Java isn't that much harder to follow than well written Java. You can't say the same about C++ or Lisp. Java is a tool for an organization where half the people are below average. Unless you're working for a very small startup or a flavor-of-the-month place like Google, that's probably going to be the story where you work.
EDIT: Oh, and it's been so long I forgot one of the major, major selling points. You could get free binaries. Almost everything else either cost money or had to be compiled (which was a royal pain in the ass in most cases).
What I meant by that is Google has such a reputation as a place to work that they can afford to hire only the cream of the crop. In other words flavor of the month for job applicants.
I agree about Java's mediocritizing effect, and that it's beneficial if you have a team of low-skill (0.6 to 0.8, on the scale described here: http://michaelochurch.wordpress.com/2012/01/26/the-trajector...) programmers. The question is: why are people hiring hundreds of weak programmers in the first place? When a team of 100 mediocrities achieves as much as a (far less costly) team of 20 good programmers in Java or (better yet) 6 good programmers in a decent language, why hire the former at all?
The Java+IDE environment seems to be designed for 0.6 to 0.8 programmers under the assumption that it's OK for them to stay at that level indefinitely. That's one thing I really dislike about the Java+IDE environment: it encourages people not to learn how things really work or to get better. It's founded on the assumption that 90% of programmers are unskilled (true) but instead of trying to improve them, it coddles them and keeps them mired in mediocrity. That whole Java+IDE setup is based on a self-fulfilling prophecy that (a) 90% of programmers are mediocre and will stay that way, (b) no one reads code, so it's okay for small programs to be inappropriately split into 10+ files, (c) programmers don't learn outside of work or want to learn new languages, (d) all code turns into shit, so there's no point in making beautiful code possible, and (e) the lifespan of a developer is 5-10 years. I don't agree with any of these.
One thing I continue to find amusing is that Clojure and Scala developers learn about Java faster, per unit time, than most Java developers. This is because, when you have a real language to work with (and a REPL) you can tackle harder and more interesting problems.
I can't prove this, but I think that anyone who is smart enough to be effective, in spite of the resistance Java throws in the way, can easily become a 1.2-1.5 programmer in a better language. The argument about "stupid" programmers who just can't grok functional programming is silly. If they can handle Java bullshit, they can learn the simpler abstractions on which decent programming is built. I definitely feel stupider when I have to use crappy languages and can't express my thoughts properly, but I hope it's pretty obvious that this "stupidity" is transient.
I agree on Java being an improvement over C++, though. C++ is a disaster.
>The question is: why are people hiring hundreds of weak programmers in the first place? When a team of 100 mediocrities achieves as much as a (far less costly) team of 20 good programmers in Java or (better yet) 6 good programmers in a decent language, why hire the former at all?
Because 90% of the people are in the bottom 90%. Not every company can hire programmers in the top 10%. That's a mathematical impossibility. Yes, we'd all love to hire only geniuses, but what do you do when you put out a job req and no geniuses apply? Refuse to hire anyone?
Harder: expand your "social surface area" by going to conferences, because the best programmers find jobs (just as the best jobs are filled) through social connections before anything is "posted". You actually have to network a lot harder when you're hiring than when you're looking for a job.