Woodworking is, like, the quintessential craft. I think it's very useful to bring it in when discussing "craft"!
I'm not a woodworker myself, but as I understand it, part of what makes it "crafty" is that the woodworker reads grain, adjusts cuts, and accepts that each board is different.
Contrast that with whatever Ikea does with wood and mass-produced furniture. I'd bet that variation in materials is "noise" that the mass-production process is designed to reject (be insensitive to, robust to).
But could we imagine an automated woodworking system that takes material variation like wood grain into account, not in an aggregate sense (as I'm painting Ikea as doing) but in an individual sense? That system would be making woodworker-like judgements.
The craft lives on. The system is informed by the judgement of the woodworker, and the craftsperson enters an apprenticeship role for the automation... perhaps...
Until you can do RL on the outcome, the finished furniture itself. But you still need craft to design the reward function.
Thanks for the reference. I knew this product in the article sounded familiar:
“Literally a piece of e-waste in waiting, Lollipop Stars are suckers with an integrated battery and tiny speaker that, when placed in one's mouth, transmit sound through jaw vibrations, delivering what the brand calls ‘music you can taste.’”
One of the stated disadvantages was “the difficulty of designing adaptable action selection through highly distributed system of inhibition and suppression”
I can’t help but wonder how well a token-based AI could iteratively tune (or develop) a subsumption-based AI.
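To make the tuning idea concrete, here's a minimal sketch of a Brooks-style subsumption controller for a toy robot. Everything here (layer names, the `obstacle_distance` sensor, the 0.2 threshold) is hypothetical, not from the article; the point is that thresholds and layer orderings like these are exactly the knobs a token-based AI could iterate on.

```python
class Layer:
    """A behavior layer: proposes an action, or None to stay silent."""
    def propose(self, sensors):
        raise NotImplementedError

class Wander(Layer):
    """Low-priority default behavior: just drive forward."""
    def propose(self, sensors):
        return "forward"

class AvoidObstacle(Layer):
    """Higher layer: speaks up only when an obstacle is near,
    suppressing the lower wander layer's output when it does."""
    def propose(self, sensors):
        if sensors.get("obstacle_distance", 1.0) < 0.2:  # tunable threshold
            return "turn_left"
        return None

def arbitrate(layers, sensors):
    """Highest-priority layer that proposes an action wins (suppression)."""
    for layer in layers:
        action = layer.propose(sensors)
        if action is not None:
            return action
    return "idle"

layers = [AvoidObstacle(), Wander()]  # ordered highest priority first
print(arbitrate(layers, {"obstacle_distance": 0.1}))  # turn_left
print(arbitrate(layers, {"obstacle_distance": 0.9}))  # forward
```

Real subsumption networks wire suppression and inhibition between individual signal lines rather than whole layers, which is where the "highly distributed" design difficulty in the quote comes from; this linear-priority version is the simplest caricature of it.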
> Intel's first 32-bit microprocessor was the iAPX 432, which was introduced in 1981, but was not a commercial success. It had an advanced capability-based object-oriented architecture, but poor performance compared to contemporary architectures such as Intel's own 80286 (introduced 1982), which was almost four times as fast on typical benchmark tests.
I know that there was OO hype, but 1981 seems kind of early. I also know that OO means many, many things. What does it mean here, if anything?
Wikipedia has a basic overview [0]. Basically, they tried to do what Java later did: there are no raw pointers in the instruction set, only special "access descriptors" that always point to a valid object. Microcode handles the rest, like garbage collection and type checking.
> Each system object has a type field which is checked by microcode, such that a Port Object cannot be used where a Carrier Object is needed. User programs can define new object types...
If there are no microcode bugs, that should in theory mean full safety and no unexpected behavior. But unlike Java with a JIT, the 432 checked every single access as it happened - no wonder it turned out to be so slow...
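A software sketch of the idea, assuming made-up type names: on the 432 this check ran in microcode on every access, which is the per-access cost mentioned above.

```python
class SystemObject:
    """An object with a type field, checked on every use."""
    def __init__(self, type_name):
        self.type_field = type_name

class AccessDescriptor:
    """Stands in for a raw pointer: always refers to a valid object."""
    def __init__(self, obj):
        self._obj = obj

    def deref(self, expected_type):
        # The analogue of the microcode type check: a Port Object
        # cannot be used where a Carrier Object is needed.
        if self._obj.type_field != expected_type:
            raise TypeError(
                f"expected {expected_type}, got {self._obj.type_field}")
        return self._obj

port = AccessDescriptor(SystemObject("Port"))
port.deref("Port")  # fine
try:
    port.deref("Carrier")  # rejected, as the microcode would do
except TypeError as e:
    print(e)
```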
Are the 50 for loops truly necessary in the manual C code example of the Kalman filter? Couldn't you at least introduce a few functions (that could be inlined and loop-fused) for some of the matrix operations?
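A sketch of the factoring I mean, written in Python for brevity (the same refactor applies to the C version, where the compiler can inline and fuse these helpers): pull the raw loops into a few matrix helpers so the predict step reads like the math.

```python
def mat_mul(A, B):
    """Matrix product of A (m x k) and B (k x n)."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def mat_add(A, B):
    """Elementwise sum of two same-shaped matrices."""
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))]
            for i in range(len(A))]

def mat_T(A):
    """Transpose."""
    return [list(row) for row in zip(*A)]

def kf_predict(x, P, F, Q):
    """Kalman predict step: x' = F x,  P' = F P F^T + Q."""
    x_new = mat_mul(F, x)
    P_new = mat_add(mat_mul(mat_mul(F, P), mat_T(F)), Q)
    return x_new, P_new
```

With helpers like these, each filter step is one line per equation instead of a nest of index loops, and the correspondence to the textbook formulas is checkable at a glance.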
I have a suspicion that Julia, owing to multiple dispatch, has a sort of regularity that makes what you said plausible.
Though there is just so much more Python to train on, and I bet they even do RL with validated rewards on Python, and probably not Julia.