Why? Is decimal faster than arbitrary precision? If not, arbitrary precision should probably be the default (or binary).
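To make the tradeoff concrete, here is a minimal Python sketch contrasting the two, using the standard-library decimal and fractions modules (the performance question itself is left as something to measure, not asserted): Decimal is base-10 floating point with bounded, user-settable precision, while Fraction is exact arbitrary-precision rational arithmetic whose operands can grow without bound.

    from decimal import Decimal, getcontext
    from fractions import Fraction

    # Decimal: base-10 floating point with a fixed context precision.
    # 1/3 is rounded to 28 significant digits (the default).
    getcontext().prec = 28
    d = Decimal(1) / Decimal(3)

    # Fraction: exact rational; never rounds, but numerator and
    # denominator can grow with every operation.
    f = Fraction(1, 3)

    print(d)              # 0.3333333333333333333333333333 (rounded)
    print(f)              # 1/3 (exact)
    print(f + f + f == 1) # True: arithmetic is exact
    print(d + d + d == 1) # False: rounding error accumulates

So the choice of default is really bounded-cost-but-inexact versus exact-but-potentially-unbounded, which is why raw speed alone may not settle it.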

