Does the design of the language make them less (or more) likely to occur, though? e.g. all these things are possible in Forth too, but the design of the language definitely seems to conspire to make them less likely than in C, in my experience (I probably have about as many hours in each now? Roughly 1k, maybe 1.5k?).
(I suspect this is the case in Forth because the stack gives a "linear-like feel" to most code; it's more obvious when you're accidentally not freeing something, because anything left over has to be explicitly discarded.)
The design doesn't do anything particularly different or cumbersome regarding memory and ownership, so I'll hazard a guess and say no, it doesn't make them less likely.