
Because 49 isn't just the "character code" for "1"; it is _the actual value_.


You're talking about the low-level representation now, not the high-level one.


If you're going to insist that a character (or string) is its own magic datatype _and is not an unsigned (or similar) underneath it all_, and you're going to talk about high-level niceties, then your interpreter really should not violate the principle of least astonishment. There is no sane way to frame "1" - 1 _unless you explicitly typecast the string to a number_, because you now have to reconcile it with what should be identical behavior for numeric types, like "1"+(-1), which, guess what, yields "1-1" in JavaScript, which is the definition of insane. You've also got to deal with other, less obvious cases, like when the string is in a var and is _not always guaranteed to be a nice number_, which makes ever using anything like this a code smell. It's far easier for both the programmer and the interpreter _not_ to play the guessing game and implement inconsistent numeric behavior, and to just say "well, I'm not going to do this unless you really insist (via an explicit cast) that you want it".
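The asymmetry described above is easy to verify in any JS engine: `-` coerces both operands to numbers, while `+` concatenates whenever either operand is a string.

```javascript
// `-` is numeric-only, so the string is coerced to a number:
const sub = "1" - 1;     // Number("1") - 1
// `+` concatenates when either side is a string, so -1 becomes "-1":
const add = "1" + (-1);  // "1" + String(-1)

console.log(sub, typeof sub);  // 0 'number'
console.log(add, typeof add);  // 1-1 'string'

// And the "not guaranteed to be a nice number" case silently yields NaN:
const odd = "1x" - 1;          // Number("1x") is NaN
console.log(odd);              // NaN
```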


Why are you insisting on ignoring the fact that JS is a loosely typed language and automatic type casting is in its DNA?


Having automatic type casting of the form we've seen above is like having a gun without a safety: it's that 1% of the time when the pin inadvertently strikes the shell that you will really wish you had a language that faulted rather than silently proceeding with broken logic and now potentially disastrously bad data. I can't believe anyone would pick a language that allows you (especially silently) to be this sloppy.
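A minimal sketch of the "fault rather than silently proceed" behavior being asked for here: wrap the explicit cast in a helper that throws on anything that isn't a clean numeric string. `toNumber` is a hypothetical helper for illustration, not a standard API.

```javascript
// Hypothetical helper: explicit, loud conversion instead of silent coercion.
function toNumber(value) {
  const n = Number(value);
  // Number("") is 0 and Number("1x") is NaN, so reject both cases:
  if (typeof value !== "number" && (value === "" || Number.isNaN(n))) {
    throw new TypeError(`Cannot convert ${JSON.stringify(value)} to a number`);
  }
  return n;
}

console.log(toNumber("1") - 1);  // 0 -- explicit, intentional conversion
try {
  toNumber("1x");                // faults loudly instead of yielding NaN
} catch (e) {
  console.log(e.message);
}
```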


You can't reason with dogmatic traditionalists like you people.



