
Forgive my ignorance, I'm actually curious. How did AI failures contribute to FP, garbage collection, and OO?


In functional languages, the modern motivations come in two parts. Well, one really, but one of the sub-parts is so big it deserves a section of its own.

AI's contributions here come in the form of automated mathematicians, a descendant of axiomatic set theory and the desire to automate mathematics via rigorous mechanical proofs. From Robin Milner we have the invention of ML, whose original purpose was as a scripting language to aid in the construction of automated proofs [1]! We also have contributions from Automath [2], which in the late 1960s already had techniques still considered cutting edge today, such as dependent types. All in all, the original goal of automated mathematics and reasoning proved impractical, but the tools developed along the way have proven anything but.

Then there is Lisp. Lisp contributed to both functional and object-oriented techniques. It was inspired by the lambda calculus, itself an offshoot of the same mechanize-mathematics movement, and was invented by John McCarthy - whose interest was in the old AI notion of symbolic computing - as a Turing-complete way of representing algorithms. A very old language, its influence on all subsequent history is undeniable. From Lisp we see that "Garbage collection was invented by John McCarthy around 1959 to solve problems in Lisp" [3]. We also see that AI researchers had concepts of object orientation in the early 1960s [4], and that the later mixins and multiple inheritance had roots in Lisp. The development of multiple inheritance was also motivated by the needs of the old symbolic reasoning researchers.
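For the curious: the core idea McCarthy needed - reclaim memory by tracing what's still reachable - is tiny. Here's a toy mark-and-sweep sketch in Python (all names here are illustrative, not McCarthy's actual Lisp implementation):

```python
# Toy mark-and-sweep collector. A "heap" is a list of cells; each cell
# may point at other cells. Anything unreachable from the roots is garbage.

class Cell:
    def __init__(self, refs=None):
        self.refs = refs or []   # cells this cell points to
        self.marked = False

def mark(cell):
    """Recursively mark every cell reachable from `cell`."""
    if cell.marked:
        return
    cell.marked = True
    for ref in cell.refs:
        mark(ref)

def sweep(heap, roots):
    """Mark from the roots, then keep only the marked (live) cells."""
    for cell in heap:
        cell.marked = False
    for root in roots:
        mark(root)
    return [c for c in heap if c.marked]

# c is unreachable from the root a, so it gets collected; b survives.
c = Cell()
b = Cell()
a = Cell(refs=[b])
live = sweep([a, b, c], roots=[a])
assert b in live and c not in live
```

Real collectors are vastly more sophisticated, but this is the 1959 insight in miniature.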

As for Walmart? Walmart is a very big user of optimization theory and logistics. Walmart and its ilk also use data mining and rule learning techniques. And optimization is about a lot more than efficiently routing trucks: OR people are actually laying part of the foundation of AI and are seeing a lot of their work rediscovered. OR people also contributed dynamic programming, for example. This is what I mention in another post about improving synergy between all these very related silos. These topics are essentially subsets of a more general theory that will be pivotal to AI.
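To make the dynamic programming point concrete, here's the textbook OR example - the 0/1 knapsack - as a short Python sketch (my own illustrative version, not anything Walmart runs):

```python
# Classic OR-style dynamic program: 0/1 knapsack.
# dp[w] holds the best total value achievable with capacity w.

def knapsack(items, capacity):
    """items: list of (weight, value) pairs; returns the best total value."""
    dp = [0] * (capacity + 1)
    for weight, value in items:
        # Iterate capacities downward so each item is used at most once.
        for w in range(capacity, weight - 1, -1):
            dp[w] = max(dp[w], dp[w - weight] + value)
    return dp[capacity]

# Capacity 5: taking the (2, 3) and (3, 4) items gives the best value, 7.
print(knapsack([(2, 3), (3, 4), (4, 5)], 5))  # prints 7
```

The same recurrence-over-subproblems trick shows up everywhere from inventory planning to Viterbi decoding, which is part of why the OR/AI boundary is so blurry.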

I personally think optimization is a very important part of our universe. In a way, the principle of least action is doing some kind of optimization. I feel that people working in machine learning, in particular the Bayesian inference and graphical models part of AI, are actually doing a subset of quantum mechanics. Conversely, these techniques will likely prove pivotal in programming quantum systems.

[1] http://en.wikipedia.org/wiki/Robin_Milner#Contributions

[2] http://en.wikipedia.org/wiki/Automath

[3] http://en.wikipedia.org/wiki/Garbage_collection_(computer_sc...

[4] http://en.wikipedia.org/wiki/Object-oriented_programming#His...


And Walmart?


AFAIK operations research and supply chain management use techniques originating from AI.


AFAWK (As Far As Wikipedia Knows), operations research as a formal area kicked off in 1937 and had 1000 people working on it in Britain during World War II. It may in fact be true that some techniques from AI have cross-fertilized, but the credibility of the claim that "we owe operations research and supply chain management to AI" is very low.

What's next: Minsky traveled back in time in a LISP Machine and impregnated the Rev. Bayes' mother?

I'm willing to give 'em Prolog (take it, please!) but the confluence of FP and AI might owe more to the fact that AI (especially 'strong AI') was considered a Big Deal early on in computing and as such, ideas that were invented/used contemporaneously tend to be associated with AI.


The OR folks would say it's the other way around.



