Name: Anonymous 2013-09-07 16:38
When is it appropriate to use a high level programming language? Personally, I always feel guilty when using anything except C.
"But just because the language prevents you from shooting yourself in the foot at a memory level, it doesn't mean you'll code any more securely. You can still make errors in the logic itself."

Yes, but in a higher-level language (at least one that isn't shit), you can hope that the added expressiveness will eliminate redundancy, letting you concentrate better on the logic. Logic errors are also easier to notice than accidental off-by-ones or footnote-of-the-manpage errors.
"Furthermore, the best way to maintain security is to reduce or avoid complexity. There isn't a fundamental flaw in their translation, and any exploitation of programs in such a language relies more on the operating system itself. Compare that to Java and its VM."

Not necessarily. The JVM is a great example of how not to do anything, so bringing it up is pointless. As for complexity, most [s]Lisp[/s] anything interpreters out there are far, far simpler than clang or, god forbid, gcc. My ideal platform would be one in which code is interpreted by default and sometimes JITted into native code, which is then checked by a static checker (that should be an easier problem than the usual halting-complete version, since the JIT compiler will only output a very specific subset of the possible "native code" programs).
"Then test systems for that. Valgrind, cgdb, DynamoRIO, etc."

They're imperfect. So far automatic theorem provers aren't strong enough to prove the correctness of sufficiently-large-to-be-useful programs, so they give up on exploring beyond certain branching points.
"However, dependently typed languages"

Those are horrible and you know it.
"When I write C I write serious, long-term stuff that should use 100% of the machine cycles."

You need an optimizing compiler just to be able to use basic CPU instructions. That's more work for you and for the compiler. It makes compilers more bloated and harder to maintain. The compiler gets bigger just so you can write more code to do the same thing you could do in another language with a smaller compiler and less code. It's stupid for serious stuff, like using Brainfuck.
"C allows you to write at a minimal abstraction level; that's why it feels more natural, since it's simple."

You obviously never wrote any assembly. C is a terrible language for most machines because it has fewer features than the hardware. It's crap. C compilers have to recognize whole loops and turn them into single instructions. Some of those optimizations depend on behavior that is undefined in C but well-defined on your CPU, so your code can turn into garbage.
"Well, C is simple and has low-level features."

It has pointers. That is the only low-level thing about C, and C is not low level so much as crippled. You can't use arrays without resorting to unsafe pointers; that's why C can't have bounds checking without bloating up every pointer in the program. C pointers are less safe than assembly pointers because they are not machine addresses, and all of those optimizations make them dangerous.
"Lisp compiles to C code that would be impractical to write manually"

Someone made a BASIC-to-Brainfuck compiler. That Brainfuck would be impractical to write manually. Wanting to compile to a language so you don't have to write it yourself means the language is bad, but it doesn't mean it's low level. Look at all the JavaScript "transpilers" everyone is making so they don't have to use JavaScript.
"C has the option of being at any level of abstraction; if programming time isn't a problem, it's possible to reimplement anything in C."

Brainfuck also has the option of being at any level of abstraction: if programming time and code size aren't problems, it's possible to reimplement anything in Brainfuck.
"C is a terrible language for most machines because it has fewer features than the hardware."

Yeah, I had to learn this the hard way when I wrote a toy bignum library. It turns out C has no way to access the carry generated by a multiplication, so if you want it, you have to either use a type twice the size or compute the high half yourself with long multiplication, which costs three extra multiplications. Division got stdlib functions (div, ldiv) that hand you quotient and remainder together, but a defense against multiplication overflow wasn't worth anyone's time? Thanks to patterns like
ptr = malloc(width*length*height);
you can easily get memory corruption from an overflow like that.
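One common guard against that pattern, sketched here (`alloc3d` is a made-up name): check each multiplication against SIZE_MAX before performing it, and refuse to allocate if the size computation would wrap.

```c
#include <stdint.h>
#include <stdlib.h>

/* Checked version of malloc(width*length*height): returns NULL
 * instead of silently wrapping the size computation. */
void *alloc3d(size_t width, size_t length, size_t height) {
    if (width != 0 && length > SIZE_MAX / width)
        return NULL;                    /* width*length would overflow */
    size_t wl = width * length;
    if (wl != 0 && height > SIZE_MAX / wl)
        return NULL;                    /* (w*l)*height would overflow */
    return malloc(wl * height);
}
```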
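And the double-width workaround for the missing multiplication carry can be sketched like this (`mul32_full` is a hypothetical helper; it assumes a platform providing uint64_t):

```c
#include <stdint.h>

/* Recover both halves of a 32x32 -> 64 bit multiplication by widening
 * first. The high word is the "carry" C gives no direct access to. */
void mul32_full(uint32_t a, uint32_t b, uint32_t *hi, uint32_t *lo) {
    uint64_t p = (uint64_t)a * b;   /* widen, then multiply */
    *lo = (uint32_t)p;
    *hi = (uint32_t)(p >> 32);
}
```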