Hacker News

I was curious how defer is implemented. `defer` in Go is famously function-scoped, not lexically-scoped. This means that the number of actively-deferred statements is unbounded, which implies heap allocation.
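To make the "unbounded implies heap allocation" point concrete, here is a minimal sketch in C of what a compiler targeting C might emit for function-scoped defer: each `defer` pushes a record onto a per-call linked list, and the records run in LIFO order at function exit. All names (`defer_node`, `defer_push`, `defer_run_all`) are illustrative, not Go's actual runtime API.

```c
#include <stdint.h>
#include <stdlib.h>

/* One heap-allocated record per executed defer statement. */
typedef struct defer_node {
    void (*fn)(void *);
    void *arg;
    struct defer_node *next;
} defer_node;

static void defer_push(defer_node **head, void (*fn)(void *), void *arg) {
    defer_node *n = malloc(sizeof *n);
    n->fn = n ? fn : fn;  /* sketch: real code would check malloc failure */
    n->fn = fn;
    n->arg = arg;
    n->next = *head;
    *head = n;
}

static void defer_run_all(defer_node **head) {
    while (*head) {
        defer_node *n = *head;
        *head = n->next;
        n->fn(n->arg);  /* LIFO: the last defer pushed runs first */
        free(n);
    }
}

/* Demo: append each arg as a decimal digit, so the final value
 * encodes the order in which the deferred calls ran. */
static int log_val;
static void append_digit(void *arg) {
    log_val = log_val * 10 + (int)(intptr_t)arg;
}

int demo(void) {
    defer_node *defers = NULL;
    log_val = 0;
    /* defer inside a loop: the list grows by one node per iteration,
     * which is why the count is unbounded at compile time */
    for (int i = 1; i <= 3; i++)
        defer_push(&defers, append_digit, (void *)(intptr_t)i);
    defer_run_all(&defers);
    return log_val;  /* 321: deferred calls ran 3, 2, 1 */
}
```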

The answer is that Solod breaks with Go semantics here: it just makes defer block-scoped (and unavailable in for/if blocks, which I don't quite get).

https://github.com/solod-dev/solod/blob/main/doc/spec.md#def...




What's the point if it's incompatible? The README suggests using Go's testing toolchain and type checker, but that's unreliable if the compiled code behaves differently from the tested code. That's like testing and typechecking your code with a C++ compiler but then running it through a C compiler for production.

It would have been a lot more useful if it tried to match the Go behavior and threw a compiler error when it couldn't, e.g. on a defer in a loop.

Is this just for people who prefer Go syntax over C syntax?


I don't work on it regularly, but I have a proof-of-concept Go-to-C++ compiler that tries to get exactly the same behaviour: https://github.com/Rokhan/gocpp

At the moment, it sort of works for simple one-file projects with no dependencies, if you don't mind that there is no garbage collector. (It tries to compile library imports recursively, but the linking logic is not implemented.)


As long as you exclude defers in a loop, this can be done statically: count the maximum number of defers in a function, and reserve an array of that size plus a counter at function entry. That would make it a strict subset.
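A minimal C sketch of that static scheme, under the assumption that the compiler has counted the defer statements per function (names like `defer_slot` and `mark` are made up for illustration). Here `f` contains at most two defer statements, so two stack slots and a counter suffice; only the defers actually reached get registered, and they run in reverse at function exit:

```c
#include <stdint.h>

typedef struct {
    void (*fn)(void *);
    void *arg;
} defer_slot;

/* Demo payload: append each arg as a decimal digit so the result
 * encodes the order in which the deferred calls ran. */
static int trace;
static void mark(void *arg) {
    trace = trace * 10 + (int)(intptr_t)arg;
}

int f(int take_branch) {
    /* Compile-time fact: f contains at most 2 defer statements,
     * so a fixed-size array on the stack is enough. */
    defer_slot defers[2];
    int ndefers = 0;
    trace = 0;

    defers[ndefers++] = (defer_slot){ mark, (void *)(intptr_t)1 };
    if (take_branch)
        defers[ndefers++] = (defer_slot){ mark, (void *)(intptr_t)2 };

    /* Function exit: run only the registered defers, LIFO order. */
    for (int i = ndefers - 1; i >= 0; i--)
        defers[i].fn(defers[i].arg);
    return trace;  /* 21 if the branch was taken, 1 otherwise */
}
```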

tbh I'd rather have this behaviour; defer should have been lexically scoped from the beginning.

> This means that the number of actively-deferred statements is unbounded, which implies heap allocation.

In C you can allocate dynamically on the stack using alloca or a VLA.
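A hedged sketch of that idea using a C99 VLA (the `slot` struct and `run_n_defers` are illustrative). This works when the defer count becomes known at run time before the storage is created; a defer executed an unbounded number of times inside a loop would instead need per-iteration `alloca` calls, whose memory is only reclaimed at function return:

```c
#include <stdint.h>

typedef struct {
    void (*fn)(void *);
    void *arg;
} slot;

/* Demo payload: sum the args of all deferred calls. */
static int total;
static void add(void *arg) {
    total += (int)(intptr_t)arg;
}

int run_n_defers(int n) {
    /* n is a run-time value, but a VLA still gives the defer
     * records stack storage with no heap allocation. */
    slot defers[n];
    int ndefers = 0;
    total = 0;
    for (int i = 1; i <= n; i++)
        defers[ndefers++] = (slot){ add, (void *)(intptr_t)i };
    for (int i = ndefers - 1; i >= 0; i--)  /* LIFO at exit */
        defers[i].fn(defers[i].arg);
    return total;  /* sum of 1..n */
}
```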




