You know how errors in destructors are transformed into warnings?
I was comparing various strategies for filling out arrays in MATLAB. While looking through the docs I noticed an implementation of a doubly linked list, and thought, why not include that and see how badly it performs?
Just pull the code for dlnode out of your own MATLAB help files:
edit([docroot '/techdoc/matlab_oop/examples/@dlnode/dlnode.m']);
then copy and paste it right into a file named dlnode.m on your MATLAB path, and then try this:
tail = dlnode(0);
for i = 1:510
    new = dlnode(i);
    insertAfter(new, tail);
    tail = new;
end
clear tail;
clear new;
The output from this overflows the command window buffer so I won’t post the whole thing. Here’s an excerpt:
In dlnode>dlnode.delete at 64
In dlnode>dlnode.delete at 64
In dlnode>dlnode.delete at 64
In dlnode>dlnode.delete at 64
In dlnode>dlnode.delete at 64
In dlnode>dlnode.delete at 64
Warning: The following error was caught while executing 'dlnode' class destructor:
Maximum recursion limit of 500 reached. Use set(0,'RecursionLimit',N) to change
the limit. Be aware that exceeding your available stack space can crash MATLAB
and/or your computer.
> In dlnode>dlnode.delete at 64
In dlnode>dlnode.delete at 64
In dlnode>dlnode.delete at 64
In dlnode>dlnode.delete at 64
In dlnode>dlnode.delete at 64
In dlnode>dlnode.delete at 64
In dlnode>dlnode.delete at 64
…Yeah, not only is the error transformed into a warning, but what an error it is! Linked chains of handle objects pile destructors up on the stack, and if too large a group of handle objects goes out of scope at once, the stack overflows. And a goddamn stack overflow gets exception-funneled down into a warning!
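The mechanism is easy to see once you sketch it. Each node holds the only remaining handle to the next node, so destroying one node fires the next node's destructor before the current one returns: one stack frame per node in the chain. Conceptually (this is my sketch of the hazard, not the actual dlnode source):

```matlab
classdef chainnode < handle
    % Minimal sketch of the recursive-destruction hazard.
    properties
        Next  % handle to the next node; once the head of the chain
              % goes out of scope, this is the only reference to it
    end
    methods
        function delete(obj)
            % Destroying obj destroys its Next property. If obj held
            % the last reference to that node, the next node's delete
            % runs right here, inside this frame, and so on down the
            % chain until the recursion limit is hit.
        end
    end
end
```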
Your computer has multiple gigabytes of memory, and you cannot make a linked list more than 500 items long without breaking MATLAB. Oh, wow.
Well, you can increase the recursion limit, but not by much before you get hard crashes.
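The usual workaround is to unlink the chain iteratively before the last reference goes away, so that no node's destructor ever has more than one node to clean up. Something along these lines, assuming the Next property and disconnect method from the documented dlnode example (the function name clearList is mine):

```matlab
function clearList(head)
% Tear down a dlnode chain iteratively so destruction never recurses.
% Assumes the documented dlnode interface: a Next property and a
% disconnect method that unlinks a node from its neighbors.
    while ~isempty(head)
        next = head.Next;  % hold on to the rest of the chain
        disconnect(head);  % unlink head so it can be destroyed alone
        head = next;
    end
end
```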
So how badly does the Mathworks' own example of a linked list perform, anyway? Well, building a linked list N items long turns out to be somewhere between O(N^2) and O(N^3). Wowsers. (For the data-structures-impaired among you: it ought to be O(N).) And it's factors of ten slower than the next slowest way of building up a list of indeterminate length.
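For the curious, the comparison was along these lines: time building a dlnode list against the naive grow-an-array approach. (The loop bounds and timing scaffold here are mine, not the original benchmark, and the list lengths are kept under the recursion limit so the implicit teardown doesn't blow the stack.)

```matlab
Ns = [50 100 200 400];
for N = Ns
    % Strategy 1: build a dlnode chain, one insertAfter at a time.
    tic
    tail = dlnode(0);
    for i = 1:N
        new = dlnode(i);
        insertAfter(new, tail);
        tail = new;
    end
    tlist = toc;

    % Strategy 2: grow a plain array with no preallocation,
    % deliberately the "slow" baseline.
    tic
    a = [];
    for i = 1:N
        a(end+1) = i; %#ok<AGROW>
    end
    tarr = toc;

    fprintf('N=%4d  dlnode: %.4fs  growing array: %.4fs\n', N, tlist, tarr);
end
```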
All of these — the absurd slowness of object property access, the recursive piling-up of destructors on the stack, the exception-swallowing — are knock-on effects of the Mathworks' insane notion that they can do automatic memory management without a garbage collector, guarantee deterministic destruction, and support arbitrarily connected reference graphs, all at once. A system built under those constraints must do at least one of three things: accomplish what has never before been accomplished in the history of computer science, blow up in time and space complexity, or fail to work as advertised.
They might as well have assumed P=NP while they were at it.
UPDATE: It gets worse! 500 was the limit in R2010a, but as of R2011b there's twice as much recursion per finalization, so you can only make a list 250 items long!