You know how errors in destructors are transformed into warnings?
I was comparing various strategies for filling out arrays in MATLAB. I looked at the docs and noticed an implementation of a doubly linked list, and thought, why not include that and see how badly it performs?
Just pull the code for dlnode out of your own MATLAB help files:
edit ([docroot '/techdoc/matlab_oop/examples/@dlnode/dlnode.m']);
then copy and paste it right into a file named dlnode.m on your MATLAB path, and then try this:
tail = dlnode(0);
for i = 1:510
    new = dlnode(i);
    insertAfter(new, tail);
    tail = new;
end
clear tail;
clear new;
The output from this overflows the command window buffer so I won’t post the whole thing. Here’s an excerpt:
In dlnode>dlnode.delete at 64
In dlnode>dlnode.delete at 64
In dlnode>dlnode.delete at 64
In dlnode>dlnode.delete at 64
In dlnode>dlnode.delete at 64
In dlnode>dlnode.delete at 64
Warning: The following error was caught while executing 'dlnode' class destructor:
Maximum recursion limit of 500 reached.
Use set(0,'RecursionLimit',N) to change the limit. Be aware that exceeding
your available stack space can crash MATLAB and/or your computer.
> In dlnode>dlnode.delete at 64
In dlnode>dlnode.delete at 64
In dlnode>dlnode.delete at 64
In dlnode>dlnode.delete at 64
In dlnode>dlnode.delete at 64
In dlnode>dlnode.delete at 64
In dlnode>dlnode.delete at 64
…Yeah, not only is the error transformed into a warning, but what an error it was! Linked chains of handle objects pile their destructors up on the stack, and the stack overflows if too large a group of handle objects goes out of scope at once. And a goddamn stack overflow gets exception-funneled down into a warning!
Your computer has multiple gigabytes of memory. You cannot make a linked list more than 500 items long without breaking MATLAB. Oh, wow.
Well, you can increase the recursion limit, but not far before you get hard crashes.
So how badly does the Mathworks’ own example of a linked list perform, anyway? Well, building a linked list N items long turns out to be between O(N^2) and O(N^3). Wowsers. (For the data-structures-impaired among you, it ought to be O(N).) And it’s factors of ten slower than the next slowest way of building up a list of indeterminate length.
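For contrast, here is what linear-time construction and teardown look like; a quick sketch in Python rather than MATLAB, with the teardown written as a loop so that no chain of finalizers ever piles up on the stack (the names and sizes are illustrative, not from the original benchmark):

```python
# A doubly linked list built in O(N): each insertion is O(1) pointer surgery.
class Node:
    def __init__(self, data):
        self.data = data
        self.prev = None
        self.next = None

def build(n):
    """Build a list holding 0..n, appending at the tail."""
    head = tail = Node(0)
    for i in range(1, n + 1):      # n insertions, constant work each
        new = Node(i)
        new.prev = tail
        tail.next = new
        tail = new
    return head

def tear_down(head):
    """Unlink nodes in a loop, so reclamation never recurses."""
    while head is not None:
        nxt = head.next
        head.prev = head.next = None
        head = nxt

head = build(10000)                # far past MATLAB's recursion limit
count = 0
cur = head
while cur is not None:
    count += 1
    cur = cur.next
tear_down(head)
```

The same unlink-in-a-loop trick is presumably the fix for a recursive delete method like dlnode's: walk the chain and sever each node's links yourself, so finalization never has a 500-deep pile of neighbors to chase.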
All of these — the absurd slowness of object property access, the recursive piling-up of destructors on the stack, the exception-swallowing — are knock-on effects of the Mathworks' insane notion that they can do automatic memory management without a garbage collector, and guarantee deterministic destruction, and support arbitrarily connected reference graphs. A system built under those constraints will do at least one of three things: accomplish what in the history of computer science has never before been accomplished, or blow up in time and space complexity, or simply not work as advertised.
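To see why those three guarantees can't coexist, consider the canonical counterexample, sketched here in Python, whose reference counting is the nearest real-world cousin of what the Mathworks is attempting: a two-node cycle that refcounting alone can never reclaim, which is exactly why CPython also carries an actual garbage collector.

```python
import gc
import weakref

class Node:
    pass

a, b = Node(), Node()
a.other, b.other = b, a      # a two-object reference cycle
probe = weakref.ref(a)       # lets us observe whether `a` was ever freed

del a, b
# Reference counting alone cannot free the cycle: each object still holds
# the other's count above zero, so "deterministic destruction" never fires.
assert probe() is not None

gc.collect()                 # it takes a real garbage collector to break it
assert probe() is None
```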
They might as well have assumed P=NP while they were at it.
UPDATE: It gets worse! 500 was the limit in R2010a, but as of R2011b there's twice as much recursion on finalization, so you can only make a list 250 items long!
Errors that happen during onCleanup are transformed into warnings? Really?
It doesn't help that cleanup functions also can't be closures — they can't respond to information about a resource that was gathered while the program ran. But first things first.
Hey: if an error happens in my code, that is an error. My program should not continue unless it specifically handles that error. If an error happens while my program is trying to clean up after itself, that means something is wrong and MATLAB should not force my program to blithely continue and wreak further havoc.
I’ve restrained myself from picking too hard on The Mathworks’ decision to promise deterministic destruction for closures and objects, even though it has unacceptable performance penalties. The reason for the restraint is that I can see the argument for object lifecycle management: when your objects correspond to exclusive resources you hold, you do want to have control and guarantees over when they get released.
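For the record, the resource-lifecycle problem has a standard solution that needs no destructors at all. A sketch using Python's context-manager protocol (the FakeLock class is a made-up stand-in for an exclusive resource): release is tied to scope exit, not to object finalization, so it is deterministic without requiring any promises about when the object itself dies.

```python
class FakeLock:
    """Made-up stand-in for an exclusive resource (a file handle, a mutex)."""
    def __init__(self):
        self.held = False

    def __enter__(self):
        self.held = True
        return self

    def __exit__(self, exc_type, exc, tb):
        self.held = False    # release happens at scope exit, deterministically
        return False         # do not swallow any in-flight exception

lock = FakeLock()
with lock:
    assert lock.held         # held for exactly the duration of the block
assert not lock.held         # and released the moment the block ends
```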
Well, I just looked into onCleanup and as usual, the Mathworks fucked it up: You cannot write robust programs using onCleanup, because exceptions during cleanup are swallowed.
So I'm going to have to start kicking at the thirty-misfeature pileup where MATLAB's memory management meets its error handling, after all.
There are a number of languages that offer both automatic memory management and exceptions. A few of them are Python, R, Java, and MATLAB. All of these except MATLAB handle resource cleanup easily with a try/finally statement, which MATLAB lacks, and most also offer some extra sugar in the form of a try-with-resources construct, which MATLAB tries to do with deterministic destructors, and fails.
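Here is what that looks like in practice; a Python sketch (the function names are invented for illustration) showing that cleanup in a finally block always runs, and that a failure during cleanup surfaces as a real, catchable exception rather than a warning:

```python
def close_resource():
    # Invented for illustration: a cleanup step that itself fails.
    raise IOError("disk vanished during cleanup")

def use_resource():
    try:
        return "work done"
    finally:
        close_resource()     # always runs; if it fails, the error propagates
                             # instead of being demoted to a warning

try:
    use_resource()
    outcome = "no error"
except IOError as e:
    outcome = str(e)
```

After this runs, `outcome` holds the cleanup failure's message: the error made it out to the caller, which is the whole point.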
One of these things is not like the others:
“…propagating the original error), it can do so.”
One of these things is not like the others, and the one that fucked up is of course MATLAB.