
From reading the code [1], neither definition is strictly needed; the `SIZEOFUINT64 = (uint64_t) ((uint64_t*) SELFIE_URL + 1) - (uint64_t) SELFIE_URL;` statement produces the value that always ends up being used (once the library is initialized). An optimizing compiler would likely fold the calculation into a constant anyway. My guess is that it's predefined at the top simply for clarity and verbosity's sake (remember that this is built for educational purposes).

[1]: https://github.com/cksystemsteaching/selfie/blob/5de675a0f08...



It's really scary, and I struggle to understand what's wrong with just

    const size_t SIZEOFUINT64 = sizeof (uint64_t);


I think it's because their subset of C [1], which they call C Star (C*), doesn't support the `sizeof` operator. Since Selfie is supposed to be able to compile itself, it seems they've restricted themselves to the grammar supported by C*.

[1]: https://github.com/cksystemsteaching/selfie/blob/50b5fec8378...


C Star (C*) ?!?

Oh, well.

https://en.wikipedia.org/wiki/C*

=> "It was developed in 1987 as an alternative language to Lisp and CM-Fortran for the Connection Machine CM-2 and above. The language C* adds to C a "domain" data type and a selection statement for parallel execution in domains."


Aah of course. Didn't read that far. Thanks.


Their compiler doesn't support `sizeof`. Maybe it's like that to ease bootstrapping?



