Product: Book - Hardcover
Title: Advanced Compiler Design and Implementation
Publisher: Morgan Kaufmann
Authors: Steven Muchnick
Advances in compiler design do not get much press these days. The reasons for this are unclear, but no doubt the perception that compilers need no further improvement has something to do with it. This book, written by one of the leading experts on compilers, certainly dispels that belief. Once readers get used to the idiosyncratic ICAN (Informal Compiler Algorithm Notation) invented by the author and used throughout the book, they get a comprehensive overview of compilers, especially of optimization. Compilers for the SPARC, PowerPC, DEC, and Pentium architectures are treated in the book. The predominant emphasis of the book is on optimization, and so a few more recent and important topics in compiler construction, such as partial evaluation, are not discussed. Readers are expected to have a prior background in elementary compiler theory. My primary interest in reading the book was to gain insight into the compilation issues that arise in symbolic programming languages such as LISP and Prolog.
A detailed review of this book cannot be done for lack of space, but some of the helpful aspects and interesting discussions in the book include:

1. The "wrap-up" section at the end of each chapter, giving a compact summary of what was done in the chapter.

2. Generating loads and stores: the author shows how to move values to and from registers using routines more sophisticated than simply loading values into registers before using them or storing values as soon as they have been computed.

3. The main issues in the use of registers, such as variable allocation, efficiency of procedure calls, and scoping. The author lists the different categories of values that contend for registers, such as the stack, frame, and global offset table pointers, and dynamic and static links.

4. The local stack frame and its uses, such as holding indexed variables (arrays, etc.) and supporting debugging.

5. The five parameter-passing mechanisms: call by value, call by result, call by value-result, call by reference, and call by name. A thorough discussion is given of their properties and of which languages make use of them. In particular, the author notes that in C and C++ call by value is the only parameter-passing mechanism, but that the address of an object may be passed, thus essentially emulating call by reference; this can be a source of confusion to those who program in C and C++ (a small C illustration appears after this list). The most exotic of these mechanisms is call by name, a form of the "lazy evaluation" found in functional programming languages. The author gives a code example of call-by-name parameter passing in ALGOL 60. I don't know of any modern practical programming language that makes use of call by name.

6. Shared libraries and the role of semantic linking and position-independent code.

7. The compilation issues that arise in symbolic languages such as LISP and Prolog. These languages typically have run-time type checking and function polymorphism, which give them their power and ease of use. The author discusses how to produce efficient code for these languages. Since heap storage is used heavily by these languages, its allocation and recovery are very important. "Generation scavenging" is mentioned as the most efficient method of garbage collection for these languages; it has been advertised in the literature as one that minimizes the time needed for storage reclamation in comparison with other approaches. In addition, the use of "on-the-fly" recompilation for polymorphic-language implementations is discussed.

8. Dynamic programming and its role in the automatic production of code generators, as contrasted with the "greedy approach". The author explains the need for "uniform register machines" in the dynamic programming algorithm.

9. Interval analysis and its use in the analysis of control flow. This technique has been used in the field called "abstract interpretation" in recent years, the aim of which is to automatically and intelligently test program code.

10. Dependencies between dynamically allocated objects, such as links between graph structures in LISP and Prolog. The author describes the Hummel-Hendren-Nicolau technique for this, which involves naming schemes for locations in heap memory and a collection of axioms for characterizing aliasing relations among locations, and, most interestingly, uses a theorem prover to establish the properties of the data structures.
The author emphasizes, though, that this technique, like others developed for dependence analysis of dynamically allocated objects, is very computationally intensive.

11. Individual optimizations, which the author divides into four groups in order of importance.

12. Induction-variable optimizations and their role in loop optimization. The author shows how to identify induction variables and how to transform them using techniques that go by the names strength reduction, induction-variable removal, and linear-function test replacement (see the before-and-after sketch following this list).

13. Procedure integration and its role in "inlining" procedures in languages such as C++. The author emphasizes the drawbacks of inlining, such as its impact on cache misses (a tiny example follows the list).

14. The trade-off between object abstraction and optimization, which occurs in object-oriented languages such as C++. The author discusses in detail the role of interprocedural optimizations in dealing with abstraction in the object-oriented, modular approach to programming, particularly the identification of the "side effects" of procedure calls.

15. Code optimization that takes advantage of the memory hierarchy, such as data and instruction caches, and how to improve register allocation for arrays. The author gives a detailed and highly interesting discussion of scalar replacement for array elements (also sketched after this list).

16. Future trends and research in compiler design. The author mentions a few that he believes will dominate in the upcoming decade, such as scalar-oriented and data-cache optimizations; scalar compilation will be the most active research area, in his opinion. At the present time there has been discussion of "intelligent compilers" that will interact with the user to develop optimal code, or even produce correct programs. Such compilers would understand the intentions of the program and warn the user if these are violated, as well as reduce the time and cost needed for testing programs.
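A few of the topics above are easy to make concrete in C. First, item 5: a minimal sketch, entirely my own and not an example from the book, of how C's single parameter-passing mechanism (call by value) emulates call by reference by passing an address; the function names are hypothetical.

```c
#include <stdio.h>

/* Call by value: the callee receives a copy, so the caller's
   variable is unchanged. */
void inc_by_value(int n) {
    n = n + 1;              /* modifies only the local copy */
}

/* Emulated call by reference: the address itself is passed by
   value, but the callee can modify the caller's variable through it. */
void inc_by_reference(int *n) {
    *n = *n + 1;            /* modifies the caller's variable */
}

int main(void) {
    int x = 0;
    inc_by_value(x);
    printf("after inc_by_value:     %d\n", x);   /* prints 0 */
    inc_by_reference(&x);
    printf("after inc_by_reference: %d\n", x);   /* prints 1 */
    return 0;
}
```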
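For item 12, here is a before-and-after sketch of strength reduction on an induction variable, written as the equivalent C source a compiler might effectively produce (my own illustration, not the book's ICAN notation; the function names are made up).

```c
/* Before: k*i is recomputed by a multiplication on every iteration
   (and the subscript a[i] hides another multiply by the element size). */
void scale_before(int a[], int n, int k) {
    for (int i = 0; i < n; i++)
        a[i] = k * i;
}

/* After strength reduction: a running sum t maintains the invariant
   t == k*i, so an addition replaces the multiplication. */
void scale_after(int a[], int n, int k) {
    int t = 0;
    for (int i = 0; i < n; i++) {
        a[i] = t;
        t += k;             /* addition replaces k*i */
    }
}
```

Induction-variable removal and linear-function test replacement would go further, eliminating the original counter entirely by rewriting the loop test in terms of a derived induction variable such as a pointer into the array.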
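Item 13, procedure integration, is the easiest to picture: the callee's body replaces the call, removing call overhead and exposing optimization across the former call boundary, at the cost of larger code (hence the cache-miss concern the author raises). A hypothetical before-and-after:

```c
/* Before: each use of sq() is a call (unless the compiler inlines it). */
static int sq(int x) { return x * x; }
int f(int a) { return sq(a) + sq(a + 1); }

/* After procedure integration: the body is substituted at each call
   site, and the expressions can now be optimized together. */
int f_inlined(int a) {
    int t1 = a * a;
    int t2 = (a + 1) * (a + 1);
    return t1 + t2;
}
```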
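Finally, item 15: a sketch of scalar replacement of an array element, again my own example rather than the book's. Keeping the repeatedly accessed element in a local scalar lets the register allocator hold it in a register instead of loading and storing it through memory on every inner iteration.

```c
/* Before: a[i] is re-read and re-written on every inner iteration,
   since the compiler may not keep an array element in a register.
   (Assumes n <= 100 to match the fixed row length.) */
void rowsum_before(int a[], int b[][100], int n) {
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            a[i] = a[i] + b[i][j];
}

/* After scalar replacement: the element lives in the scalar t,
   and memory is touched only once per row. */
void rowsum_after(int a[], int b[][100], int n) {
    for (int i = 0; i < n; i++) {
        int t = a[i];
        for (int j = 0; j < n; j++)
            t = t + b[i][j];
        a[i] = t;
    }
}
```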
Product: Book - Paperback
Title: Find It Online, Fourth Edition: The Complete Guide to Online Research
Publisher: Facts on Demand Press
Authors: Alan M. Schlein
This is the second of three basic guides from Facts on Demand Press that I am very happy to have in my collection and to recommend to others.
Some really top-notch information brokers contributed to this book, and it is a superb, well-organized reference; about the only thing it lacks is a CD-ROM with clickable links, or an online version accessible for a fee or via a password in the printed book.
This book is extremely well developed, to the point that it can meet the needs of a first-time researcher eager to become quickly familiar with the ins and outs of the Internet, as well as those of the more experienced professional who wants a handy reference work to suggest new sources and methods.
The other two books are Helen Burwell's "Online Competitive Intelligence"--the one book to buy if you can only buy one of these three--and Sankey & Weber's "Public Records Online" (buy it only if you do work in this area or want to protect yourself by monitoring your divorced spouse's assets, etc.).
Product: Book - Paperback
Title: Mastering Digital SLR Photography
Publisher: Muska & Lipman/Premier-Trade
Authors: David D. Busch
The thing I like best about this book is that all the techniques deal with the special needs, strengths, and weaknesses of the digital SLR camera. Other digital photography books I've checked out are too general, and don't cover DSLRs in much detail. This one is written specifically for the digital SLR owner.
For example, in Chapter 2, the author describes eight things you can do with DSLRs that aren't possible or as easy with other types of camera. I put this information to work right away to improve my pictures! In Chapter 4, he explains some pitfalls to avoid with digital SLR viewfinders, as well as how to determine whether your sensor has dust or dead pixels (and how to fix both).
There's a whole chapter on working with DSLR RAW files, including recommendations for RAW converters for each of the major DSLR brands. I found the chapter on working with SLR lenses particularly valuable, including the clear explanation of "bokeh", the difficulty of obtaining wide-angle coverage, use of image stabilization, and other digital SLR topics. I haven't seen such comprehensive coverage of night, infrared, and ultraviolet photography elsewhere, either.
I recommend this book over other digital photography books that don't emphasize single lens reflex techniques as much as this one does. Most of the coverage is aimed at the most recent Nikon and Canon digital SLRs, but there is lots of information of use to owners of Minolta and Olympus DSLRs, too. If you're moving up from a point and shoot digital camera and want to learn the specifics of DSLR work, this is the book you need.
Product: Book - Hardcover
Title: What the Dormouse Said: How the 60s Counterculture Shaped the Personal Computer
Publisher: Viking Adult
Authors: John Markoff
As all major movements and innovations seem to come out of periods of cultural upheaval, so it is with the computer revolution that brought about the information age. Here we see that Steve Wozniak's Apple I was just the immediate cause of the soon-to-come home-computing explosion; it wasn't until Homebrew Computer Club mate Steve Jobs saw that the market was ripe to start selling computers that the market took off. But underlying this well-known story of garage-built computing is a much deeper and much more interesting story of how the field of computer science developed in step with the intellectual community, and how it wasn't until these fields clashed with (or were symbiotically nurtured by) the 1960s psychedelic counterculture, as only California could have produced it, that computer science really took off. "What the Dormouse Said" explores how the computer industry needed freedom from the heavy top-down institutions of the East Coast and found it in Silicon Valley.
Of course it all started with the transistors that TI built into integrated circuits in 1958. This was the essential technology that made the revolution possible, and though the IC wasn't perfect, it was only a few years before the idea of a home PC was possible. As possible as it was, Digital's CEO Ken Olsen said that there was no reason anyone would want a computer in their home. This backward view, like Bill Gates in 1981 saying there was no reason a PC would require more than 640K of RAM, seems laughable in hindsight, yet it was these philosophies, held by forward-thinking men no less, that probably slowed down the process. It only follows that if these were the innovators, closed-mindedness must have been the prevailing stance within the computer science community. Nevertheless, progress did happen, and considering that within twenty years of the invention of the transistor solid-state computing was a solid technology, it could very well be that those years saw a far greater technological leap than we have seen in the last twenty.
As is always the case, it was mid-level people who truly brought about the computer revolution. These people, the intelligent mid-level doers rather than the business leaders, were able to thrive technically in the environment of the 1960s, which questioned everything. This questioning allowed the cutting-edge technology industry to break away from the stifling corporate mentalities of the existing tech businesses, and even of the universities, which were still under the yoke of a 19th-century corporate mentality to a great extent. Stanford University offered a strange mix: it was willing to fund computer research, yet it was a hotbed of counterculture. As a university with a certain amount of prestige yet by no means an overwhelmingly stifling atmosphere, it was a breeding ground for new ideas, and this naturally turned out to be a nurturing environment for technical innovation.
John Markoff explores this time of innovation that resulted in the fledgling PC industry. The book is less a narrative and more a mix of events, accounts of people within the industry, and researched texts. It is a very fast and interesting read. The connection between drugs and the enhancement of consciousness, and the idea, which Doug Engelbart apparently had, that computers could augment the human intellect, was visionary, though quite possibly accidental. The drug culture of the 1960s at least opened the door to the idea of a world connected by computers. Reading this book really makes one aware of how visionary and pioneering these young computer scientists really were. I have been a fan of Markoff and his articles for a long time, and I can see he put a lot of effort into making this book lucid and vital. This history is very important to us now, and it had me questioning whether WWII or the PC revolution was the most important event of the 20th century. The only problem is that the book seems somewhat disjointed, and I had trouble following it at times. Overall I think this book is fascinating and should be required reading for engineering students.