
Satan is the father of lies

Name: Anonymous 2018-01-08 23:17

These C liars constantly claim that the work of other people came from C and Bell Labs hackers. Who would do that other than Satan and his servants?

They are so deluded by Satan that they believe the idea of variables and ASCII text came from C.

Okay, what y'all youngins need to understand is what programming looked like before C, and what it looked like with C.

What if, rather than manually typing in memory addresses, you could store stuff in variables that the rest of the program can access BY ENGLISH NAME? Or, when you need to, store memory addresses in variables, which you could still access by name? What if you automated all this, so the program itself allocated, read, and wrote memory addresses to the variables, so no programmer would ever need to do it themselves?
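A minimal C sketch of exactly that trade, assuming nothing beyond standard C (the raw array standing in for "memory" and the function names are made up for illustration):

```c
/* Before: you tracked where things lived yourself. Crude imitation,
   with one raw byte array standing in for memory and the programmer's
   hex sheet baked into the offset. */
unsigned char mem[16];

void old_way(void) {
    mem[0x04] = 42;   /* "the score lives at offset 0x04" -- YOU remember that */
}

/* After: the compiler allocates the storage, you just use the name. */
int score;

void new_way(void) {
    score = 42;       /* no address bookkeeping anywhere */
}
```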

THIS is the level we're working at, OP. For some newage twirp to say it's a "fundamentally flawed language and has many errors" is just mentally painful. By errors he means "It doesn't do this thing we have space to run now, it not doing something that other things do is a FLAW and an ERROR"
We went from "Manually type in the address of the thing you want, or maybe offset stuff from an address if you really want, either way you're keeping a spreadsheet full of hex codes on your desk whether you like it or not" to "Lol here it all is, named and everything, no need to know the address, you can simply copy the address of a variable to another variable and use the second to access the first, if something out of scope wants direct access to the first, just give it the second and it can do what the fuck it wants, go wild with math if you like but don't be a twat and TRY to keep track of where everything is"
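In C terms, that "second variable" is just a pointer. A few lines, nothing exotic (the function name is made up for illustration):

```c
/* "copy the address of a variable to another variable and use the
   second to access the first" */
int first = 10;
int *second = &first;    /* second now holds first's address */

/* Something outside first's scope can still reach it if you hand
   the pointer over -- it writes by address, not by name. */
void out_of_scope(int *p) {
    *p = 99;             /* this changes first */
}
```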
Do you even begin to understand how much of a DREAM this shit was to programmers? It's all these halfwit newbies that think the compiler should keep everything safe for them no matter what that look at C and go "Now, this language so utterly pure and without clutter, it's FUNDAMENTALLY FLAWED and HAS MANY ERRORS because IT'S NOT FULL OF SHIT THAT DOES EVERYTHING FOR ME, IT DOESN'T GET IN THE WAY, IT DOESN'T STOP ME FROM DOING WHAT I WANT, IT EXPECTS ME TO KNOW WHAT I'M DOING, IT'S SO GARBAGE BECAUSE I'M TOO FUCKING DUMB TO FOLLOW THE SIMPLEST RULEBOOK IN PROGRAMMING"

YES YOU HAVE TO KEEP A VAGUE IDEA OF HOW YOUR MEMORY WORKS, BUT C DOES SO BLOODY MUCH FOR YOU THAT ANYTHING MORE IS LITERALLY "BABBY'S FIRST DRAG AND DROP HOCKEY GAME"
IT'S NOT EVEN MANAGING MEMORY, IT'S SITTING IN YOUR COMFY OFFICE TELLING THE PR GUY TO GET RID OF DEPARTMENTS YOU DON'T NEED ANYMORE, AND ADDRESSING EMPLOYEES BY THEIR BADGE NUMBER, NOT THEIR NAME, GENDER, WHAT OR WHO THEY ARE, JUST A SIMPLE NUMBER, AND PEOPLE STILL MANAGE TO SCREW IT ALL UP AND SAY "IT'S TOO HARD"

Name: Anonymous 2018-01-11 8:20

>>18
>had they put aside their ego and just used something "easy" like Lisp, Python, or Java, the entire computer security industry would have no reason for existing.

you are hilariously wrong. let me explain as someone who works in computer security: not all bugs are buffer overflows, and bugs that aren't buffer overflows cannot be prevented by using Lisp, Java or Python. low-level bugs that are eliminated by memory-safe languages are commonly found either in performance-critical programs like multimedia codecs (so your garbage-collected languages are not the best choice here) or in operating system kernels, which need to be able to directly access memory, so even if you wrote one in FIOC or lithp, you'd need a non-memory-safe superset, which would make many of the same errors possible (although some things would admittedly be less error-prone: Pascal-style strings as used in Java are harder to fuck up than null-terminated ones, and string processing is a huge source of C bugs).
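The string point in a nutshell: with a null terminator the length is implicit and every consumer has to trust it, while carrying the length with the call lets the copy refuse to run off the end. A minimal sketch, with made-up function names (`risky_copy`, `safer_copy` are not real library calls):

```c
#include <string.h>

/* Null-terminated: length is implicit. Classic overflow shape --
   no idea how big dst is, the caller had better have checked. */
void risky_copy(char *dst, const char *src) {
    strcpy(dst, src);
}

/* Length-carrying ("Pascal-style") flavour: the size travels with
   the call, so the copy is bounded. */
void safer_copy(char *dst, size_t dst_len, const char *src) {
    strncpy(dst, src, dst_len - 1);
    dst[dst_len - 1] = '\0';   /* strncpy may not terminate; do it ourselves */
}
```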

meanwhile, FIOC and Java (Lisp not so much) are commonly used as backend languages in web applications and anyone with knowledge of web security knows that web application backends can be absurdly insecure. deserialization has huge exploitation potential: look up the attacks based on ChainedTransformer in Java and the security implications of pickle.loads() in Python. another common problem is using user-supplied data when constructing queries to different languages: SQL injection isn't just something that happens in PHP, and even more hilarity ensues when a web programmer passes arbitrary shit to Runtime.getRuntime().exec(). dynamic languages like FIOC also have good old eval(), which is also obviously easy to abuse.
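The injection-prone pattern is language-independent: user input gets pasted raw into something that is really code. Here it is in C, purely to show the shape -- `build_query` is a hypothetical helper, not any real database API:

```c
#include <stdio.h>
#include <string.h>

/* User input pasted straight into a query string. Whatever the user
   types becomes part of the query's logic -- that IS the injection. */
void build_query(char *out, size_t n, const char *user_name) {
    snprintf(out, n, "SELECT * FROM users WHERE name = '%s'", user_name);
}
```

Feed it `x' OR '1'='1` and the "name" stops being data: the WHERE clause now matches every row. The fix (parameterized queries) works by keeping data and code in separate channels, which is why it looks the same in every language.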

Java is also the main language used in the Android operating system, and it's not like Android is some ultra-secure unhackable masterpiece of programming. and there are absolutely exploits that use its Java-based APIs and not just the underlying Linux OS - just read up on security of Android IPC. yes, most of it is about faulty logic and bad access control. that's what a lot of modern security is about.

and this brings us to my final point: most of modern security is not actually about buffer overflows, cross-site scripting, race conditions, SQL injections or any other strictly programming-related errors. those things exist and they can be a source of major problems, but the biggest, most damaging and most high-profile attacks are less about that and more about policies, business logic and good old social engineering. why spend time searching for 0-days if you can phish passwords from employees because they are idiots and then access the whole network because the IT are idiots too?

PS. that's just scratching the surface. there's also crypto-related errors and hardware bugs. a good language won't fix bad math or bad electronics.

Name: Anonymous 2018-01-11 12:42

>>30
this doesn't change the fact that the 'security industry' would still exist without C - because as I demonstrated in my post, low-level vulnerabilities caused by the lack of memory safety are currently just a small subset of what security is about

Name: Anonymous 2018-01-11 15:09

>>42
you're pointing to software written in C (SQL implementations, deserialization libraries that link to compiled C, etc.) as evidence that memory-safe languages are unsafe too.
but SQL injections have nothing to do with C or memory, they'd be the same regardless of language used for the interpreter. same goes for deserialization and OS command injection. the issue is that user-supplied data is being inserted raw into what is essentially code. you could maybe argue that this is a type safety issue but even the fancy typefag research languages haven't solved that problem yet (because they'd rather focus on academic shit than on anything practical).
>if the programmer couldn't find/fix the error in his own code, what hope does a security consultant have?
from experience? a lot of hope. programmers usually think in terms of intended use and unintentional error, not in terms of malicious misuse.
>aside from crypto and buffer overflows, security is mostly common sense.
interfacing between different high-level languages is not common sense (and that's the source of deserialization errors, SQLi, XSS). race conditions are not common sense. any sort of complex interaction with untrusted input is not common sense. communication protocols are not common sense. also, given how common the social engineering attacks are - even common sense isn't that common.
