The January release of Franz's Allegro Common LISP 8.0 puts developers on notice that "exotic" programming tools, long relegated to research environments, are becoming more viable options for mainstream applications. LISP, PROLOG, genetic programming and neural nets are among the technologies increasingly ready for Web-facing roles.
Allegro CL has had two major updates since the last time eWEEK Labs reviewed it. The 8.0 version reflects performance-enhancing improvements so startling that if it played baseball, we'd expect a congressional probe.
Interpreted code ran on our Advanced Micro Devices Athlon 64-based test machine at speeds formerly seen only on specialized hardware; compiled functions ran so quickly that well-known LISP benchmarks took too little time to measure.
Older versions of the Allegro environment fell short of competing development tools, even by the lower standards of bygone times. Version 8's source editor, debugger and other coding aids, however, have pulled abreast of the rising standard of the high-end Java environments that are LISP's most likely competitors. These new tools take full advantage of incremental compilation and of the rigorously consistent syntax of LISP to offer developers superb convenience in viewing and managing their work.
Allegro CL includes database interface drivers, XML parsing capabilities, Perl-compatible regular-expression parsing, and a suite of XML-oriented functions and utilities. The centerpiece of the 8.0 release is AllegroCache, a persistent object store that's directly accessible from LISP code and that offers developers both stand-alone and client/server transactional database capabilities. Developers who've thought of LISP programs as being impressive demonstrations but unsuited to production use may find that adding AllegroCache to the recipe creates an entirely different dish.
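To make the persistent-object-store idea concrete without presuming anything about AllegroCache's actual API, here is a minimal sketch of the same concept using Python's standard `shelve` module: application objects are stored and retrieved by key, and survive across program runs with no separate object-to-table mapping layer. The `Customer` class and the store path are purely illustrative.

```python
import os
import shelve
import tempfile

# Illustrative only -- this is NOT AllegroCache's API, just the general
# persistent-object-store concept it represents: program objects persist
# directly, keyed by identifier, with no relational mapping code.
class Customer:
    def __init__(self, name, balance):
        self.name = name
        self.balance = balance

path = os.path.join(tempfile.gettempdir(), "store-example")

# Store an object; it is serialized transparently.
with shelve.open(path) as db:
    db["cust-1001"] = Customer("Acme Widgets", 2500.00)

# A later run (or another process) can load the same object back.
with shelve.open(path) as db:
    cust = db["cust-1001"]
    print(cust.name, cust.balance)  # Acme Widgets 2500.0
```

A production store such as AllegroCache adds what this sketch lacks: transactions, indexing and client/server access.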
LISP is not the only example of what some might consider an exotic programming technology that's actually available in a robust and fully supported form. PROLOG, neural nets and genetic algorithms are other technologies that programmers might mentally relegate to the AI (artificial intelligence) hype bubble of the '80s but that some developers are applying today to the kinds of problems that modern enterprise applications most need to solve.
These tools are suited to the building of extensible and adaptive systems, to the recognition of patterns that no one thought to seek out, and to the rapid generation of "good enough" answers, especially where a useful answer right now is preferred to a better answer later.
Like LISP, PROLOG in the '80s got more hype than it could handle. However, implementations such as SICStus Prolog from the Swedish Institute of Computer Science are alive and well and on the leading edge of hot Web applications such as speech recognition. "PROLOG was originally designed for language processing (this isn't as well known as it should be) and is extremely good for that kind of job," said Manny Rayner, staff member at NASA's Ames Research Center in Moffett Field, Calif., during a conversation with eWEEK Labs.
Rayner is one of the developers of the Clarissa speech-recognition system that assists International Space Station crew members and is thought to be the first speech-recognition system deployed in space. Rayner's development stack includes SICStus Prolog as one of the foundations of his open-source Regulus spoken-dialog processor.
The full Regulus stack, said Rayner, enables much more rapid development than statistical approaches to speech recognition: "You can develop a command grammar fairly quickly, without having to collect a huge amount of data."
Larry Deschaine, an engineering physicist with Science Applications International, in Aiken, S.C., seeks meaning in data sets rather than in spoken commands. His doctoral research in machine learning took him on a tour of about 20 software tools that could generate models consistent with data, but not overfitted to specific values to the point of losing predictive power. Deschaine found what he needed in the technique of linear genetic programming, which he's used in applications ranging from the modeling of waste incinerators to the discrimination of unexploded ordnance from other ground clutter.
Deschaine spoke with eWEEK Labs about his use of Discipulus, a genetic programming engine from RML Technologies.
"We've taken code that would run for weeks or months, even on a remote server, and made it run in milliseconds on a Web page," he said.
Among the programming techniques least familiar to mainstream developers is the simulated neural network, a software mechanism that creates and adapts a drastically simplified simulation of the behavior of biological neurons in connecting with and signaling each other in response to varied input. The technique finds relationships in data that conventional algorithm development methods might fail to uncover.
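The "drastically simplified simulation" the article describes can be shown in miniature. The sketch below (illustrative only, and unrelated to any commercial package) trains a tiny sigmoid network by gradient descent to learn XOR, a relationship no single linear rule can capture; the network size, learning rate and epoch count are arbitrary choices for the example.

```python
import math
import random

random.seed(0)

# A drastically simplified neural network: 2 inputs, 4 hidden sigmoid
# units, 1 output, trained by per-sample gradient descent on XOR.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

H = 4  # hidden units
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    hidden = [sigmoid(sum(w * xi for w, xi in zip(ws, x)) + b)
              for ws, b in zip(w1, b1)]
    return hidden, sigmoid(sum(w * h for w, h in zip(w2, hidden)) + b2)

def loss():
    return sum((forward(x)[1] - y) ** 2 for x, y in data)

initial_loss = loss()
lr = 0.5
for _ in range(5000):
    for x, y in data:
        hidden, out = forward(x)
        d_out = 2 * (out - y) * out * (1 - out)  # chain rule at the output
        for j in range(H):
            # d_h uses the pre-update w2[j], per standard backpropagation
            d_h = d_out * w2[j] * hidden[j] * (1 - hidden[j])
            w2[j] -= lr * d_out * hidden[j]
            for i in range(2):
                w1[j][i] -= lr * d_h * x[i]
            b1[j] -= lr * d_h
        b2 -= lr * d_out

print(initial_loss, loss())
```

The network is never told a rule for XOR; the connection weights adapt until the input-output relationship emerges, which is the property that lets such networks find patterns conventional algorithm design might miss.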
"I was looking for a software tool that was easy to use and would do forecasting based on complex parameters," said independent health care industry consultant Barb Tawney, who described to eWEEK Labs her use of NeuralTools from Palisade.
"NeuralTools brings to forecasting a new, broad set of capabilities that did not exist in the open market without a lot of specific programming," said Tawney in Charlottesville, Va. "With other packages, you have to be really careful. With the Palisade product, it's really hard to ask the wrong question."
Technology Editor Peter Coffee can be reached at firstname.lastname@example.org.
Peter Coffee is Director of Platform Research at salesforce.com, where he serves as a liaison with the developer community to define the opportunity and clarify developers' technical requirements on the company's evolving Apex Platform. Peter previously spent 18 years with eWEEK (formerly PC Week), the national news magazine of enterprise technology practice, where he reviewed software development tools and methods and wrote regular columns on emerging technologies and professional community issues. Before he began writing full-time in 1989, Peter spent eleven years in technical and management positions at Exxon and The Aerospace Corporation, including management of the latter company's first desktop computing planning team and applied research in applications of artificial intelligence techniques. He holds an engineering degree from MIT and an MBA from Pepperdine University, and he has held teaching appointments in computer science, business analytics and information systems management at Pepperdine, UCLA, and Chapman College.