John Koza has something like three or four books on Genetic Programming. It's all in Lisp and I enjoyed working with different examples from his books. If I ever have enough free time, I'd like to go back and work with it again for a while.
Thank you for taking the time to drop this link. I had exactly the question I think you anticipated: "Where can I get a really well-made washer and dryer?"
This has been exactly my strategy when writing software for medical devices, in my case, ultrasound systems.
You have to help other developers understand how and when to use asserts, and then you have to test the developed device very well, so that it won't assert in the field.
Yet if it does assert in the field, it's highly likely that it will get a lot of management attention.
Here's my take. Asserts are a kind of error handling. They handle situations where the code can tell that it's off the rails. It might be off the rails due to a hardware error or a software implementation error, e.g., a 'can't happen' situation.
We didn't have debug-only asserts. Asserts were enabled in the software we verified and shipped.
It took a while for developers to be able to determine when a situation called for an assert, and when it called for what might be called traditional error handling.
The strategy of shipping with asserts enabled kind of worried some folks. They were concerned that it might assert in front of a customer. I understand the concern, but in our domain, if you're doing an OB exam with the ultrasound system and you have a choice of asserting or showing a femur length measurement of -1039cm, which is better?
We didn't have many asserts in the field. We had a lab with about 25-30 ultrasound machines running embedded tests constantly. Each machine was connected to a JTAG debug board, so we could flash new code into the system and set a breakpoint on the assert routine itself, saving a "core dump" for subsequent debugging without having to reproduce the system state that led to the assert.
The whole lash-up evolved over a period of years. It worked well, so far as I know.
One mitigating factor was that our systems were class B devices. They were always supposed to be used by a medically trained professional who had the ability to take over if the system asserted, power failed, etc.
I'm sure this isn't an original idea, but I've always understood that an assert is intended for the developer, an error is intended for the user. Sometimes an assert can be by itself because you know that the invalid condition will be 'absorbed' (for want of a better word), and sometimes an assert is followed by an error because you want the developer to see more than what the error will display to the user.
I have had this book for many years. I remember the toy expert system ("Otto") and enjoyed learning from the example. If I remember correctly the author makes good use of CLOS in the book.
Looks neat and I'd like to try it. I wonder how well it would work from within a GNU Emacs shell buffer (comint based, I think) running bash. Has anyone given this a shot?
Yep. My 2012 Tundra still runs well and looks good. The only issue is that the steering wheel has been polished smooth over years of driving. I am at 160,000 miles and expect to get to at least 250,000.
> At a large insurance company I worked for, management set a target of 80% test coverage across our entire codebase. So people started writing stupid unit tests for getters and setters in Java DTOs to reach the goal.
I attended many TOC conferences in the 90s and early 2000s. Eli Goldratt was famous for saying "Tell me how you'll measure me, and I'll tell you how I will behave."
I was somewhat reminded of Larry Wall's newsreader, 'rn', as I read the description of Orange Site Hit. I invested a lot of hours on netnews back in the day.
The computer had 2048 words of erasable magnetic-core memory and 36,864 words of read-only core rope memory. Both had cycle times of 11.72 microseconds. The memory word length was 16 bits: 15 bits of data and one odd-parity bit. The CPU-internal 16-bit word format was 14 bits of data, one overflow bit, and one sign bit (ones' complement representation). [1]