Date: Fri, 10 Jun 2011 23:43:38 -0700
From: Matt Birkholz <matt@birk...>
The first patch implements alienate_float_environment() and calls it
before every callout and after every callback.
This is no surprise. If we hadn't been running with all exceptions
trapped since forever, I'd be worried that we really need to mask all
exceptions in primitives by default anyway.
The second patch implements BORKED_FENV.
That this is necessary is a bit surprising. Can you set a breakpoint
in gdb on alienate_float_environment, both with libc's fe* and with
Scheme's fe*, and step through the machine instructions to compare
what they do differently? (You'll have to avoid fesetenv altogether,
since Scheme doesn't implement FE_DFL_ENV.) I looked at the glibc
source code, and I see no substantial difference beyond fnstcw vs
fstcw in fedisableexcept (which shouldn't make a difference here).
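A session along those lines might look like the following (the binary name is hypothetical; alienate_float_environment is the function from the first patch):

```
$ gdb ./mit-scheme
(gdb) break alienate_float_environment
(gdb) run
(gdb) x/16i $pc        # dump the next machine instructions at the breakpoint
(gdb) stepi            # single-step one instruction at a time
```

Repeating this with the libc fe* functions and with Scheme's versions should show exactly where the instruction streams diverge.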
Author: Matt Birkholz <matt@birk...>
Date: Wed Jun 1 20:54:13 2011 -0700
Oops. Too much garlic in my copy pasta -- *brain burp*. I guess the
only way to test this would be to allocate a stack large enough that
it doesn't fit in the low 4 GB of the virtual address space, which
would fall outside the domain of our automated testing facilities...