
How to C

Name: Anonymous 2016-01-08 14:23

https://matt.sh/howto-c

Very good read for anyone that wants to write this outdated language in 2016.

Name: Anonymous 2016-01-08 14:26

>>1
Developers routinely abuse char to mean "byte" even when they are doing unsigned byte manipulations. It's much cleaner to use uint8_t to mean a single unsigned byte
How is "uint8_t" cleaner than a simple and elegant "char"? These faggots desperately need a renaming of all the standard types. In particular, getting rid of the "_t" bullshit is imperative.

Name: Anonymous 2016-01-08 15:06

>>2
Nobody guarantees that char is unsigned, so you can get all sorts of undefined behavior that way.

Name: Anonymous 2016-01-08 15:32

>>3
use unsigned char

Name: Anonymous 2016-01-08 15:41

>>4
B (Byte) 1 char
u1 (void.h) 2 chars
uint8_t (stdint.h) 7 chars
unsigned char 13 chars
Making life harder for yourself?

Name: Anonymous 2016-01-08 15:46

Unsigned:signed
1c:B Byte C Char
2c:W Word S Short
4c:D Dword I Int
8c:Q Qword L Long
There is no reason to use anything longer besides masochism

Name: Anonymous 2016-01-08 17:17

>>4-5
Go back to your dead BBS, FrozenAnus.

Name: Anonymous 2016-01-08 17:31

Why did the jews add _t to everything?

Name: Anonymous 2016-01-08 17:51

>>8
_type=_t

Name: Anonymous 2016-01-08 18:33

>>9
But it's already obvious that it's a type from its syntactic position. No other language needs "_t" in type names.

Name: Anonymous 2016-01-08 19:39

>>6
Unsigned:signed:float
1c:B Byte C Char H Half-precision(16bit)
2c:W Word S Short F Float(32bit Single-Precision)
4c:D Dword I Int T Double(64bit Twofold-Precision)
8c:Q Qword L Long E Long Double(80 bit Extended Precision)
V:void

Name: Anonymous 2016-01-08 21:45

>>10
But my Hungarians!

Name: Anonymous 2016-01-08 21:49

>>>/hackernews/

Name: Anonymous 2016-01-08 22:04

This article reads like those Java spoofs of "Hello, World!" which get longer and longer as "abstractions" and "best practices" are applied.

Serious Poe's law shit going on here.

Name: Anonymous 2016-01-08 22:19

>>14
He does have one sound piece of advice:

So, do NOT do this:

#ifndef PROJECT_HEADERNAME
#define PROJECT_HEADERNAME
.
.
.
#endif /* PROJECT_HEADERNAME */

Do THIS instead:

#pragma once

Name: Anonymous 2016-01-09 7:00

I usually get an anger boner when I read something like "Modern XYZ" or "XYZ in 20XX" -- implying the "old" (and often proven) way of doing something has suddenly become obsolete.
As far as I can tell, there are two different reasons for such opinions to occur:
1. The model of something has changed, like an API, which is only sometimes changed for good reason (e.g. 3D APIs, server backend stuff, etc).
2. Someone wants to sell you an opinion ('sell', as in book, seminar or vendor-lock-in).
This one, though, just seems to be someone (partially) uninformed. Especially the stdint.h promotion seems baseless and useless: there are solid reasons for using char/int/unsigned (also, they wouldn't still be in the standard if they were 'obsolete'/dangerous, like gets()).
I do have to agree with him on size_t (although I try to avoid it) and -Wall -Wextra, though.

Name: Anonymous 2016-01-09 7:48

>>15
That's non-standard C and should thus not be used.

Name: Anonymous 2016-01-09 8:24

Use your i's

Name: Anonymous 2016-01-09 9:59

https://matt.sh/howto-c
It's an accident of history for C to have "brace optional" single statements after loop constructs and conditionals. It is inexcusable to write modern code without braces enforced on every loop and every conditional. Trying to argue "but, the compiler accepts it!" has nothing to do with the readability, maintainability, understandability, or skimability of code. You aren't programming to please your compiler, you are programming to please future people who have to maintain your current brain state years after everybody has forgotten why anything exists in the first place.

Name: Anonymous 2016-01-09 12:50

>>16
solid reasons for using char/int/unsigned
What are they?
char is always obviously u8
int is always at least 16 bits (so use uint_fast16_t for loops) and you'd be insane to say you wanted different behaviour on 32/64 bit platforms
unsigned should be default because it is far more likely what you want, and you should "opt-in" to the slightly different behaviour of some signed operations, e.g. mod/div/rem/quotient

Name: Anonymous 2016-01-09 13:55

>>20
char is always obviously u8
No it's not, the signedness of char is implementation defined.

Name: Anonymous 2016-01-09 14:24

>>21
And how is that a solid reason for using char?

Name: Anonymous 2016-01-09 15:35

>>20-21
char is also not always 8 bits, but it always represents one byte on the given architecture (even if that's 9 bits or something else exotic)

Name: Anonymous 2016-01-09 19:32

>>23
A byte is always 8 bits. A char is the smallest addressable word.

Name: Anonymous 2016-01-09 21:03

>>23,24
As per IEC 80000-13 in fact.

Name: Anonymous 2016-01-09 22:15

>>24-25
char is CHAR_BIT bits wide; the C standard requires it to be at least 8; POSIX mandates CHAR_BIT == 8

Many Texas Instruments DSPs have 16- or 32-bit chars; machines with 36-bit architectures, e.g. DEC PDP-6/10, could have 5, 6, 7, 8 or 9 bits per character (+ 4, 0, 1, 4 or 0 spare bits), and due to the C standard's requirement of at least 8 bits per char, they implemented 9-bit chars.

Name: Anonymous 2016-01-09 22:27

>>26
Machine-dependent data type width is unportable. char and int are thus shit.

Name: Anonymous 2016-01-09 22:57

>>26
POSIX mandates CHAR_BIT == 8
Only POSIX 2008 and maybe 2003

>>24
Are you claiming that PDP10 had 8-bit bytes?

>>25
Prove it

Name: Anonymous 2016-01-10 4:18

>>28
"In English, the name byte, symbol B, is used as a synonym for octet. Here byte means an eight-bit byte. However, byte has been used for numbers of bits other than eight. To avoid the risk of confusion, it is strongly recommended that the name byte and the symbol B be used only for eight-bit bytes."

Name: Anonymous 2016-01-10 6:31

>>29
This proves nothing.

Name: Anonymous 2016-01-10 7:53

>>30
Sometimes a cigar is a penis. But in the other nine hundred ninety nine million nine hundred ninety nine thousand nine hundred ninety nine out of one billion times, it is just a cigar. It is then usually a safe bet to call a cigar a cigar, and not quibble over the slight chance you may be cutting a penis open when you roll that blunt.

Name: Anonymous 2016-01-10 8:01

>>31
Except that this is totally irrelevant to what >>24,25-chans claim.

Name: Anonymous 2016-01-10 10:40

ATTENTION! This thread is derailed. Please respond to >>27 in order to continue pertinent discussion.

>>28
I cited a standard. Just look it up, you goit.

Name: Anonymous 2016-01-10 10:48

What the shit is everyone talking about?!?
A char is always:
able to hold a decimal value from 0 to 127
able to hold 8 binary digits
able to hold an ASCII character (ANSI, too, but that's a little more complicated when CHAR_BIT isn't 8)
of size 1
And if you don't see the implications of this, then just continue using uint8_t.

Name: Anonymous 2016-01-10 11:16

>>27
If the minimum range of int is guaranteed sufficient for your purposes, then letting the system use its default register width is ideal.

Name: Anonymous 2016-01-10 12:37

>>29
We are talking about C programming language, not English language.

>>34 and to some extent >>27
Not all platforms implement uint8_t, for example 16-bit TI DSPs. Because of that, char is a much more portable type, because it means the smallest addressable type on the given architecture (except when the architecture would otherwise allow chars smaller than 8 bits, the C standard's lower limit). And as >>34-kun stated, sizeof(char) is always 1. Types whose size is not a multiple of char have no right to exist on such architectures in a native way.

Name: Anonymous 2016-01-10 14:45

>>35
As int is at least 16 bits wide, it is better in that situation to use uint_fast16_t or uint_least16_t, depending on whether you are iterating or storing in an array.

>>36
Not all platforms implement uint8_t
OK: http://www.cplusplus.com/reference/cstdint/
However, UINT8_MAX is defined to be 255, so uint8_t (or whichever of uint_fast8_t or uint_least8_t is chosen if it does not exist) is still more useful than char. Refer back to >>27.

Name: Anonymous 2016-01-10 17:10

the stdint.h on mingw introduces a new exciting
type that force cludges for anyone attempting
to use it. Source:
typedef unsigned uint32_t; //line 40
this typedef requires 2 extra types
(long,unsigned long)
to be handled as special cases and
typedef signed char int8_t;//line 35
requires handling an extra(char) type which is considered
distinct from signed char.

Name: Anonymous 2016-01-10 20:33

>>38
You need a wider terminal mate.

Name: Anonymous 2016-01-10 21:05

Can /prog/ write a better "How to C"?

Name: Anonymous 2016-01-10 21:57

>>40

Sure:

Unless you're writing something which directly interfaces with the hardware, don't.

Even if you are interfacing with hardware, consider any other language with inline asm.

Name: Anonymous 2016-01-10 22:06

>>40
Don't be a retard. If you are a retard, don't program at all.

>>41
You are probably a retarded code monkey.

Name: Anonymous 2016-01-10 22:13

check 'em

Name: Anonymous 2016-01-10 22:14

>>42
As >>41, I am a big believer in teaching the machine to measure and figure out how to make things fast, not requiring the human to hand-hold the CPU through its every step. The latter is the human working for the machine, which is backwards.

Name: Anonymous 2016-01-10 23:14

>>33
It's simply a recommendation, not a definition.

Name: Anonymous 2016-01-12 9:47

>>40
How to C in 2016:
Read K&R and C89.

Name: Anonymous 2016-01-12 10:49

>>43
nice

>>1
You should write Rust instead.

Name: L. A. Calculus !jYCj6s4P.g 2016-01-12 22:11

>>1
SHUT UP YA FUCKIN YUPPY. IF U WROTE DAT ARTICLE UR A BIT PUSHIN STACK BOI RETOID DAT YUPPS ON HIS OWN COCK.

>>2
OR JUST DONT INCLUDE FUCKIN <stdint.h>

>>8
ITS DA FAT CATS AT DA FUCKIN ISO DAT DID IT, DA SAME ONES DAT HAVE PICKED AWAY N CONTINUE TO PICK AWAY AT DA LANGUAGE DAT DEANIS RICKY LOVED TURNIN IT TO A HUNK OF CORPORATE CRAP.

DA STAK BOI DAT WROTE DA ARTICLE'S JUST ONE OF DER FUCKIN PAWNS. DOESNT KNO DA FIRST THING ABOUT C, JUST HAS HIS HED UP HIS BOSS'S BUTT N HIS BOSS SAID SOME SHIT IN PASSIN ABOUT C SO HE THOT HED CHECK IT OUT N HERE'S DA CRAP HE CAME OUT WITH. NOTHIN OUT OF DA ORDINARY RLY. IF U DIDNT FIND C URSELF, ON UR OWN TERMS, U SHUD SHUT UR FUCKIN MOUTH N WRITE ABOUT SOMETHIN U ACTUALLY UNDERSTAND

>>15

I HAV A BETTER IDEA: DONT DO NESTED INCLUDES, YA FUCKIN RETOID. THEN U DONT NEED EITHER ONE.

>>16

I do have to agree with him on size_t (although I try to avoid it) and -Wall -Wextra, though.

EXCEPT HE THINKS U NEED TO USE IT ALL THE FUCKIN TIME BECAUSE HE THINKS U NEED TO BE ABLE TO STORE THE SIZE OF EVERY POSSIBLE FUCKIN OBJECT ALL THE FUCKIN TIME WHEN int WILL PROBABLY DO DA JOB WELL AND WITH NO FUCKIN ISSUES FOR REPRESENTIN DA SIZE OF MOST OBJECTS.

N WAT DA FUCK'S WITH HIS COMMENT DAT sizeof RETURNS SOMETHIN? IT'S A FUCKIN OPERATOR. I KNO HIS STACK BOY ASS THINKS sizeof(int) IS A FUNCTION CALL, BUT I GOT NEWS FOR U, U BIT PUSHERS: ITS TWO OPERATORS: sizeof FOLLOWED BY A CAST. YAINT RED DA STANDARD, YAINT WRITTEN A DECENT LINE OF C, AND YAINT ABLE TO DISCUSS C INTELLIGENTLY. GO BAK TO UR FUCKIN NEW AGE YUPPY LANGUAGES

>>19

UR BIRTH'S AN ACCIDENT OF HISTORY, YA FUCKIN STACK BOI RETOID

Name: Anonymous 2016-01-12 22:15

>>48

Please write a C tutorial, LAC.
With lots of examples and shit.
Maybe we can even publish it.
Those "publish your own book" services allow you to print 100 copies for around $2 each.

Name: L. A. Calculus !jYCj6s4P.g 2016-01-12 23:07

I KNO A GOOD GUIDE TO C, ITS CALLED KERNYAN N RICKY 2

DO DEY USE HEADER GUARDS IN K&R2? NO:

calc.h:
#define NUMBER '0'
void push(double);
double pop(void);
int getop(char []);
int getch(void);
void ungetch(int);


(DEY TEACH EM THO, BUT GUESS WAT? DER'S NO FUCKIN #pragma BULLSHIT)

HOW ABOUT FIXED INTEGER TYPES N <stdint.h>? NOWHERE TO BE FOUND.

AND sizeof?

The sizeof operator yields the number of bytes required to store an object of the type of its operand. The operand is either an expression, which is not evaluated, or a parenthesized type name.

OPERATOR. NOT A FUCKIN FUNCTION.

WAT ABOUT EXCESSIVE USE OF size_t IN K&R2? HELL NO:

int getline(char line[], int max);

int strlen(char *s);

ARE DEY WRITING getline AND strlen FOR ANYTHIN OTHER THAN THEIR OWN PERSONAL USE HERE? NO. SO WHY DA FUCK IS size_t IMPORTANT IF U ALREADY KNOW THAT int CAN STORE DA SIZES OF EVERYTHIN UR WORKIN WITH? IF UR WRITIN A LIBRARY AND U HAVE NO FUCKIN IDEA HOW SOMEONE'S GONNA CALL UR FUNCTION, THEN YEA, size_t'S DA TYPE TO USE THERE COS U DON'T KNOW WHAT'S GOIN INTO UR FUNCTIONS AND U WANT TO HANDLE AS MANY INPUTS AS POSSIBLE. BUT OTHERWISE UR JUST OVERENGINEERIN SHIT, AND IF U WANT TO OVERENGINEER SHIT, U SHOULD BE USIN C++ OR JAVA AND WRITIN A CLASS FOR EVERY LITTLE FUCKIN THING.

IN CONCLUSION, BIT PUSHER MATT'S POSITION ON C, WITH HIS HIGHLY SOFISTICATED TYPE NAMES, AMOUNTS TO: "I'M TOO STOOPID TO READ DA STANDARD (DAT'S Y I THINK sizeof IS A FUNCTION!), TOO STOOPID TO READ ABOUT DA RANGES OF STANDARD TYPES (DAT'S Y I USE FIXED WIDTH INTEGER TYPEZ!), N TOO STOOPID TO THINK ABOUT MY PROGRAM'S INPUT RANGES (DAT'S Y I USE size_t EVERYWHERE! IT HANDLEZ EVERYTHIN!)"

THANKS FOR DA ARTICLE, BIT PUSHER MATT, UR A REAL PIECE OF WORK. I HOPE DIS ARTICLE GETS U WHAT U WERE SEEKING ALL ALONG: A BIG FAT SLOPPY KISS FROM UR BOSS IN DA BIG MONDAY MEETING, AND A STICKER U CAN WEAR ON UR SHIRT DAT SAYS "C EXPERT", SO U CAN PARADE AROUND FOR DA REST OF DA WEEK PORTRAYING A SENSE OF SUPERIORITY OVER UR FELLOW COFFEE-GUZZLING RETOIDS

NOW GET DA FUCK OUTTA MY THRED, YA CORPORATE CRAPAZOID.

Name: Anonymous 2016-01-16 3:02

I love C. It's really fun, like a puzzle, to come up with ways to do shit with such an un-expressive language.

Name: Anonymous 2016-01-16 14:40

Name: Anonymous 2016-01-20 22:55

>>41

Look at this horse's ass. In the real world, you always write in C.

Name: Anonymous 2016-01-21 0:12

>>53
Have fun living in the 70s.

Name: Anonymous 2016-01-21 2:03

>>54
Why do you kids always think that 'new = better'?
C is much more useful and efficient than some ``modern'' languages out there.

Name: Anonymous 2016-01-21 2:35

>>55
Bullshit. C is not useful, and not more efficient. Every major C project I've seen "In The Real World", aka in actual use in companies, is bloated to shit with string libraries, bad attempts at OO, bad attempts at dynamic dispatch, all because the language itself doesn't do jack shit and is fucking useless.

There is no efficiency or usefulness at all in C in the Real World™.

Name: Anonymous 2016-01-21 2:50

>>55
Programmer time is far more expensive than computer time. In the real world in most applications, it is not necessary for a programmer to manually manage computer resources to solve most software problems. It is normally a more efficient use of the programmer time for them to use a high level language that allows the programmer to concentrate on the business logic with minimal regard to machine logic.

Name: Anonymous 2016-01-21 5:14

>>55
So tell me, which language is your kernel written in? Your windowing system? Your file manager? Your web browser?

In the end, pretty much everything is in C.

How is it not useful?

Where's your Haskell (or whatever ``modern'' language you advocate) operating system?

Name: Anonymous 2016-01-21 7:10

>>58
Kernel, fine. But do you see how much bullshit all these massive file manager, window manager, and web browser projects have to put up with? And again, they all have their Greenspun bloat of janky dynamic infrastructure cobbled together because it's impossible to write such large, dynamic programs in straightforward C. It always becomes a beyond-C abomination.

All of these projects would get a ton shorter, simpler, and yes FUCKING FASTER if they wrote it in a good fast compiled dynamic language, Common Lisp being my choice.

If you're writing a 100-line C program, or a straightforward data conversion library, sure, whatever, you won't hit too many problems. If you're writing a million-line monster that isn't the kernel & drivers, C is fucking stupid, a waste of time, and all the shit you pile on top of it to try to implement your dynamic runtime user features slow the fuck out of it.

Everything is machine code. C is just one way to get there, and is an extremely shitty way for large dynamic applications. Enough compilers and JITs generate their own byte-level machine code without going through C.

Name: Anonymous 2016-01-21 7:14

>>59
Oh, and do I even need to mention pointer level security, especially when piling on the dynamic infrastructure shit on top of C? It's unmanageable, with billions of unknown vulnerabilities that continually creep out, because it's impossible to manage the dynamic state from the static view of the world that C imposes. Fuck that noise.

Name: Anonymous 2016-01-21 7:35

>>59
1. Lisp depends on C or generates C code (with the exception of some exotics like https://github.com/benthor/mongoose)
2. Lisp is much slower and takes more memory. GC, eval and list structures aren't cheap.
3. Lisp can be replaced by adding a tokenizer-eval macro feature to the C preprocessor; see
http://w11.zetaboards.com/frozenbbs/topic/11501531/
which would eliminate any advantage of using Lisp as a C preprocessor if it were added to the C standard.

Name: Anonymous 2016-01-21 8:01

>>61
Fuck you, and fuck your ignorance.

Lisp has its own assemblers and compilers. The bootstrap from C could happen in any language.

Large C bullshit does fucking reference counting and shit-tons of copies. GC is faster in large apps.

Tokenizer eval shit is shit. Flipping around cons cells does not even scratch the surface of a powerful, fast, dynamic language.

You're a cave man of computing. Stop waving your shit around people who know how to drive a computer.

Name: Anonymous 2016-01-21 8:15

>>62
Stop waving your shit around people who know how to drive a computer.
The world doesn't run on elegant abstractions. It runs on C/C++, and that won't change for a while (and the C/C++ standards guarantee it will stay backward compatible).
You fear that adding tokenizer eval will make your Lisp irrelevant? Guess what: it's already implemented in some third-party preprocessors; it just isn't part of the C standard.

Name: Anonymous 2016-01-21 8:26

>>63
Tokenizer cons cells are literally a toy. How are you going to do networking/graphics/sound/input/etc. in the preprocessor? Do you seriously believe that's in the future for the preprocessor of a portable assembly language?

The only reason the world runs on C is because of that portable assembly language use case, which has a lot of momentum in writing OSes. But then they expose the OS interface only as a C interface, nearly requiring that their applications be written in their systems language, which is fucking backwards and everybody knows it. It's that interface that keeps everything clung to C.

But the web world is far beyond C & C++. The ENTERPRISE world loves it some Cobol, Java, and whatnot. Devops is python, perl, ruby, shell scripts. Supercomputing is Fortran. So you've got games, UI libs which are still stuck onto that C/OS interface shit, and basic OS utilities still being written in C. But even the utility layer is being taken over by other languages.

Name: Anonymous 2016-01-21 8:38

>>64
Tokenizer cons cells are literally a toy. How are you going to do networking/graphics/sound/input/etc. in the preprocessor?
Tokenizer-eval isn't used alone:
it produces C code which is run, and that code then uses networking/graphics/sound/input/etc. as standard C code does (via libraries/functions/etc.).
Do you seriously believe that's in the future for the preprocessor of a portable assembly language?
Yes, tokenizer-eval makes the C preprocessor more powerful than Lisp (though a bit more verbose; it will require a macro library like void.h to fully utilize). You underestimate what could be done with it, since you're used to the "elegant and terse" abstractions that come with Lisp.
The ENTERPRISE world loves it some Cobol, Java, and whatnot. Devops is python, perl, ruby, shell scripts. Supercomputing is Fortran.
Written in C/C++ or uses C infrastructure.

Name: Anonymous 2016-01-21 8:45

>>65
You underestimate what could be done with it, since you're used to "elegant and terse" abstractions that come with Lisp.
You don't even know what "elegant and terse" means, and I have no idea what you think "more powerful than Lisp" means, because you clearly know nothing of the language at all. It won't be faster, it won't be safer, it won't be more expressive. Maybe it could have a smaller footprint? That's about it.

Written in C/C++ or uses C infrastructure.
Your attempt at a meme is the whining of a child and needs to die. Because it's an artifact that has disappeared behind the scenes, it can be replaced with anything. It is a commodity of zero value in that position. Your insistence on bringing it up reinforces its lack of value.

Name: Anonymous 2016-01-21 9:05

>>66
"more powerful than Lisp" means, because you clearly know nothing of the language at all.
C macros are lexer-level: they allow literally everything.
With tokenizer-eval, macros also gain a constexpr-like ability: lexer-stage evaluation/rewriting of tokens.
Lisp lacks the above or uses convoluted expressions to partially emulate a simple C macro.
Maybe it could have a smaller footprint? That's about it.
Zero-overhead principle: if something is unused, it costs nothing. _TokenOf() is like having an extremely compact eval() which feeds back into the C preprocessor. The key reason C won is that it doesn't impose costs before use.
Your insistence on bringing it up reinforces its lack of value.
It's brought up because delusional users of toy languages often proudly claim independence from C while using software which depends on it. When Lisp OSes/games make a comeback you could claim Lisp is actually powerful/useful and not a shitty script.
After all, the abstract bullshit in Lisp hasn't helped it one bit to compete with any other language.
Where are the Lisp webservers, text/graphics/sound editors, operating systems, games today?

Name: Anonymous 2016-01-21 9:28

>>67
1) Lisp has full control (ie, controlled by plain Lisp functions) of how characters are converted to tokens, and how tokens are converted or manipulated before hitting the compiler. It is a real programming language manipulating tokens as real data structures, not a bunch of shell-like string hacks. You know nothing of what Lisp is or does, and are making a complete jackass of yourself by attempting to hold little pieces of shit above it.

2) Every Lisp compiler already does constant folding, dead code elimination, and all that. If something is unused, it costs nothing. And in a much better implementation than you'll pull off in preprocessor expansion hacks. Will cpp even have a notion of type inference? (Again, you have no fucking clue what Lisp is or what it does.)

3) If I write something in Pascal, my code has no dependence on C. You are too fucking stupid to understand this concept. Why do I even bother with your fucking willful ignorance, but here we go, for the benefit of others who don't have your mental impairments who might be interested.

There are Forth machines. There are Lisp machines. There are Oberon machines. There are assembly-based embedded machines. None of these have C in their toolchain. They are either bootstrapped from assembly languages or from their own non-C language. I can port the Pascal environment to it, and run my Pascal there. The Pascal defines the execution of my program in only Pascal terms. I can run it on a Pascal compiler/interpreter written in C, written in Assembly, written in Forth, or whatever. I don't care. Windows, Linux, and Mac are chasing after the Unix way, and have planted themselves on C. That is a purely arbitrary decision, and once you're one layer removed from that decision, C is 100% irrelevant. I can write a simple stack machine in any machine language, and high level language functionality is instantly bootstrapped and wide open to me.

The fact that C is somewhere buried in the chain, and generally stays buried there once nicer programming constructs are built on top of it shows that it is a dying language existing only on its own momentum, not any sort of benefit.

Lisp has mature in-use webservers, text editors (duh), graphics/layout editors, not sure about sound editors but I know there are music programs, the Symbolics OS is getting an emulation based comeback with many inspired projects, and Lisp game jams are a thing. Lisp-style languages are growing in popularity, while C is declining. Besides, popularity is politics, money and inertia. Powerful & useful are generally not in the limelight, no matter the industry; the McDonald's products are what's most popular.

And it's not about "abstract bullshit" (another clue that you're fucking clueless about Lisp), it's about productivity, flexibility, and directness of expression. In C you always have to build up a ton of abstract bullshit infrastructure to make anything work, so you're eating bowls of dicks while calling everybody else homo, you clueless shit-spewing hypocrite.

Name: Anonymous 2016-01-21 10:08

1) Lisp has full control (ie, controlled by plain Lisp functions) of how characters are converted to tokens, and how tokens are converted or manipulated before hitting the compiler. It is a real programming language manipulating tokens as real data structures, not a bunch of shell-like string hacks. You know nothing of what Lisp is or does, and are making a complete jackass of yourself by attempting to hold little pieces of shit above it.
-------
Lisp has macro hygiene and syntax rules. The C preprocessor doesn't: it's more powerful, though unsafe, manipulating "a bunch of shell-like string hacks", and even more powerful with _TokenOf() evaluation.

Will cpp even have a notion of type inference? (Again, you have no fucking clue what Lisp is or what it does.)
------
typeof() and __auto_type (provided by GCC)
https://gcc.gnu.org/onlinedocs/gcc/Typeof.html

As for overloading on typeexpression, its called _Generic(used extensively in void.h) and GCC-specific types_compatible_p https://gcc.gnu.org/onlinedocs/gcc/Other-Builtins.html (see choose_expr/types_compatible_p example)

3) If I write something in Pascal, my code has no dependence on C.
-----
FreePascal only (it's quite different from the enterprisey Pascals of the past, and cross-platform). They did it by manually adding huge amounts of arch-specific assembler includes. Also, writing Lisp compilers in it would be much harder due to Pascal's BDSM-like rules: you'll be forced to use assembler to work around them.

There are Forth machines. There are Lisp machines. There are Oberon machines. There are assembly-based embedded machines. None of these have C in their toolchain.
----
And they are niche or legacy products that lost the competition with C-based solutions. If the reverse was true you would be posting on LISP/Forth/Oberon machine.

shows that it is a dying language existing only on its own momentum, not any sort of benefit.
Well, GitHub shows it's one of the most popular languages. Millions of lines of it are written daily. Pretty strong "momentum" for a dead language.

Lisp-style languages are growing in popularity, while C is declining.
----
Lisp-like? Does this mean JavaScript/Scala? Do Java itself (it got lambdas recently) and C++ (C++11/14 added lots of functional stuff) fit the definition?
Could you list GitHub repositories in Lisp or Scheme (only Lisp/Scheme, not JavaScript), instead of spewing meaningless claims?

Besides, popularity is politics, money and inertia.
Are you backpedaling on Lisp popularity?

And it's not about "abstract bullshit" (another clue that you're fucking clueless about Lisp), it's about productivity, flexibility, and directness of expression.
Abstract bullshit violates the zero-overhead principle by costing performance/memory without explicit benefit to the programmer. This is already understood by the JavaScript community and has been key to improving performance while preserving productivity, unlike Lisp, where performance is seen as secondary and productivity is hampered by crippled syntax (ironically slower and less expressive than JavaScript).

Name: Anonymous 2016-01-21 10:27

Lisp has macro hygiene and syntax rules.
No it doesn't, you fuck.

typeof() and __auto_type (provided by GCC)
HAHAHAHAHAHAH, eat a dick, and learn what the fuck a compiler does, you little baby. Why the fuck are you even here if you don't even know the basic shit that's inside your beloved C compiler?

FreePascal only(its a quite different from enterprisey pascals of the past and cross-platform).
I'm talking about the goddamn language, you fuck. You have no fucking clue about anything.

If the reverse was true you would be posting on LISP/Forth/Oberon machine.
The common platforms have been commoditized. I do use virtualizations of those machines which give me a fully capable environment which can ignore the rest. You fuck.

Pretty strong "momentum" for a dead language.
I didn't say "dead" you fuck, I said it's dying. Now you're just waffling trying to defend yourself.

Lisp-like?
Clojure is the biggest "core" Lisp rising in the actual job market. Ever wonder about those languages, including C++, that are constantly playing catch-up on ancient Lisp features? Do you think that means anything you piece of shit?

Are you backpedaling on Lisp popularity?
I never said it is popular, you fuck. Stop twatwaffling. However, it was king of the hill back in the Lisp machine days, and leading up to it (which is why there was a commercial market for Lisp machines in the first place).

Abstract bullshit violates the zero-overhead principle by costing performance/memory without explicit benefit to the programmer. This is already understood by the JavaScript community and has been key to improving performance while preserving productivity, unlike Lisp, where performance is seen as secondary and productivity is hampered by crippled syntax (ironically slower and less expressive than JavaScript).
This is a quote for the ages. Never has somebody tried to talk so authoritatively about language technology, while being so demonstrably ignorant about everything within it.

You don't even know what "abstract bullshit" is or what it does. Lisp is one of the fastest languages on the planet. Even JavaScript is as fast as C nowadays in terms of the inner loop numeric processing, which is all that C fucks ever measure (because in anything regarding dynamic behavior C will fall on its fucking face as everybody blows past it).

unlike Lisp where performance is seen as secondary and productivity is hampered by crippled syntax(ironically slower and less expressive than JavaScript
It's clearly established that you know absolutely zero about the language and are still spewing falsehoods like you think you have any clue to do so.

Seriously, you really need to kill yourself. You are a worthless human being. Your brain has stopped functioning decades ago, if you're even that old. If you're a kid who's just learning this stuff (because clearly you don't even know C compilers, let alone Lisp), kill yourself anyway for being such a little fuck online.

I did C for a decade, and Lisp for about the same. You've dabbled in C and don't know jack shit about Lisp. Why the fuck are you even in this conversation? Die.

Name: Anonymous 2016-01-21 10:38

You know what, here's a challenge regarding syntax and macros, since you obviously can't read, learn, or state anything sensible in actual conversation:

Make compile-time roman numeral literals in C.

Prefix them with some single escape character for clarity. Say int i = `MCLI; should work to set i to 1151. To make it more straightforward, just use additive order-independent numerals, so `ILCM would still be 1151.

Lisp can certainly do that easily with its reader macros. How about C? How about even C with the hypothetical extensions you say are more powerful than Lisp?

Name: Anonymous 2016-01-21 10:45

(fuck, I meant preprocessor-time, not compile-time. I'd be curious to see either, but kudos only to the former.)

Name: Anonymous 2016-01-21 11:08

Lisp has macro hygiene and syntax rules.

No it doesn't, you fuck.
-------
Apparently it does, since when asked to translate a simple C macro, the response was, let me quote: "Regarding that C macro, I'm not even going to bother trying to parse that garble. At first glance, though, I will say it's generally bad form for macros to blindly insert local variable names into whatever they're generating, partially because in Lisp they're not just strings but symbols (which are interned & namespaced, and the macro & usage can be in different namespaces). The first step in converting it to Lisp would be to untangle that mess into a more sane expression."

typeof() and __auto_type (provided by GCC)

HAHAHAHAHAHAH, eat a dick, and learn what the fuck a compiler does, you little baby. Why the fuck are you even here if you don't even know the basic shit that's inside your beloved C compiler?
-----
It replaces type inference, along with _Generic (which replaces C++ ADL type-to-function resolution). It's not a perfect solution (C++ auto/decltype(auto)/decltype provide a cleaner syntax) but it works.

FreePascal only (it's quite different from the enterprisey Pascals of the past, and cross-platform).

I'm talking about the goddamn language, you fuck. You have no fucking clue about anything.
-----
No other Pascal dialect is C/C++ independent.

If the reverse was true you would be posting on a LISP/Forth/Oberon machine.

The common platforms have been commoditized. I do use virtualizations of those machines, which give me a fully capable environment that can ignore the rest. You fuck.
-----
Only in a contrived, virtual environment written in C/C++/asm.

Pretty strong "momentum" for a dead language.

I didn't say "dead" you fuck, I said it's dying. Now you're just waffling trying to defend yourself.
-------
It's patently obvious C and C++ show no signs of dying or being replaced.

Lisp-like?

Clojure is the biggest "core" Lisp rising in the actual job market. Ever wonder about those languages, including C++, that are constantly playing catch-up on ancient Lisp features?
-----
Clojure seems popular, but I don't see any serious projects in it, and it's Java-based. Could you link a few repositories?


Are you backpedaling on Lisp popularity?

I never said it is popular, you fuck. Stop twatwaffling.
---------------
>>68
"Lisp has mature in-use webservers, text editors (duh), graphics/layout editors, not sure about sound editors but I know there are music programs, the Symbolics OS is getting an emulation based comeback with many inspired projects, and Lisp game jams are a thing. Lisp-style languages are growing in popularity, while C is declining"

However, it was king of the hill back in the Lisp machine days, and leading up to it (which is why there was a commercial market for Lisp machines in the first place).
----
This era is gone. Being attached to it is akin to longing for vinyl records because of their "superior sound".

Abstract bullshit violates Zero-Overhead principles by costing performance/memory without explicit benefit to the programmer: it's already understood by the JavaScript community and has been key to improving performance while preserving productivity, unlike Lisp, where performance is seen as secondary and productivity is hampered by crippled syntax (ironically slower and less expressive than JavaScript).

Never has somebody tried to talk so authoritatively about language technology, while being so demonstrably ignorant about everything within it.
-----
I don't need to be a chef to comment on the taste of food.

You don't even know what "abstract bullshit" is or what it does. Lisp is one of the fastest languages on the planet. [delusional rants follow]
--------
Apparently, while "one of the fastest languages", it's losing all benchmarks and speed comparisons, which incidentally only use the C-like subset of Lisp, not idiomatic Lisp code (which would place at the end).
https://benchmarksgame.alioth.debian.org
The only Lisp compiler that approaches C performance is Stalin Scheme, which implements only a subset of Scheme/Lisp. Being delusional about Lisp superiority doesn't change the facts on the ground: the only way to get decent performance out of Lisp (and, by extension, any project written in a Lisp-like dialect) is to use minimal abstraction: essentially C/C++ code with Lisp overhead.

Name: Anonymous 2016-01-21 11:26

>>71
Make compile-time roman numeral literals in C.
There is no need to use _TokenOf()

#define I +1
#define II +2
#define III +3
#define IV +4
#define V +5
#define IX +9
#define X +10
#define XL +40
#define L +50
#define XC +90
#define C +100
#define D +500
#define M +1000
int i= M C L I;

Name: Anonymous 2016-01-21 11:34

>>74
That's not MCLI as a single token representing a number, as per the challenge. That's 4 tokens. Fail.

Come on, introduce actual new syntax with your shitty little preprocessor.

Name: Anonymous 2016-01-21 11:41

Do you even read English? Does using one word vs another even mean anything to you? You can't even respond to what I say.

mangled C macro
Totally doable in Lisp, and I explained the process of getting there. Desire for behavior → C fucks it up with weak-ass preprocessor shit → Unfuck it back to behavior → Nice Lisp macro that isn't a shit hack but does the same thing better. If I took the time to unmangle it, I would likely follow the same idioms in Lisp.

Fuck, I can implement the whole fucking C preprocessor in Lisp's macro system, just to shut you the fuck up about your shitty weak string masher.

No other Pascal dialect is C/C++ independent.
Pascal is a language that is not dependent on C, which is a completely different language, as you don't seem to know the distinction. Which part of the Pascal language depends on the C language? Be clear and specific, you laughably stupid piece of shit, or shut the fuck up and kill yourself.

Only in a contrived, virtual environment written in C/C++/asm.
Lisp takes my Lisp and converts it to machine code. You're contriving C into the picture because you're a desperate loser who can't admit that the old shit you've never learned past is being laughed at.

it patently obvious C and C++ have no signs of dying or being replaced.
Who even teaches C anymore in universities? Not that Java doesn't suck, but C has at the very least left the common CS curriculum. Such is the way to becoming the next Cobol.

[popularity]
I know you can't read, but "is popular" ≠ "growing in popularity" anyway.

The era is gone
Again, because top popularity is driven by money, politics, and momentum. Which speaks nothing about the power, flexibility, ease of use, and speed of a language.

I don't need to be a chef to comment on the taste of food.
You've never even tasted Lisp enough to make a determination. Reading between the lines, it's like you think it's still interpreted or something.

It's not fast
What the fuck are you smoking? Even just looking at the very first benchmark:

C:
program       secs   N   KB       gz   cpu    cpu load
binary-trees  0.09   12  ?        706  0.07   0% 30% 80% 0%
binary-trees  1.61   16  9,492    706  1.60   100% 1% 2% 1%
binary-trees  37.71  20  132,384  706  37.68


Lisp:
program       secs   N   KB       gz   cpu    cpu load
binary-trees  0.07   12  ?        612  0.06   100% 0% 0% 0%
binary-trees  1.36   16  77,168   612  1.35   1% 1% 1% 100%
binary-trees  32.77  20  325,092  612  32.71


Going down the line, Lisp takes the advantage often enough. Certainly it's competitive with C, thus "one of the fastest languages" would be apt for that competitiveness.

But this doesn't even begin to describe the speed advantages in the "real world", with large-scale programs where it actually counts. C gets slow when you use it in real systems, because it needs to dispatch, change runtime behavior based on user scripts/config/data, track shit that it doesn't know the lifetime of. Pull it into real world large scale and your C code is buggier, slower at runtime, and harder to write than Lisp, while dynamic runtimes eat those problems for breakfast.

Now, show me some fucking working roman numerals.

Name: Anonymous 2016-01-21 11:42

>>75
single token representing a number
It works anyway. Welcome to C.
introduce actual new syntax
There is no need, since C's syntax isn't crippled enough to require it.
You can do all those things in void.h without any Lisp, but you can't apply void.h to Lisp, because it would add Lisp overhead and produce the same result as C code.

Name: Anonymous 2016-01-21 11:47

Pascal is a language that is not dependent on C, which is a completely different language, as you don't seem to know the distinction.
Pascal as an abstractly delusional design document does not actually depend on C.
Pascal as implemented in real-world software does.
You could say Lisp doesn't depend on assembler (as a language spec), but in reality it does (unless you evaluate Lisp on paper).

Name: Anonymous 2016-01-21 11:51

who can't admit that the old shit you've never learned past is being laughed at.
In reality you're the one being laughed at, since you deny the reality of using C/C++ software in your toy language by pretending it doesn't exist.

Name: Anonymous 2016-01-21 11:55

>>77
Fuck off, then. You know the preprocessor is shit, and the hacks people are doing with it are just playing with it for fun.

>>78
So Pascal written for, say, retro home computers that didn't run C doesn't exist. Or Symbolics Pascal, which didn't involve C, doesn't exist. Uh huh. I know your mind is warped to be C-centric, because you're incapable of gaining skills beyond the most basic of Lego bricks, but the world is bigger than that.

I know you're probably going to type more, but as you can't pull your head out of C's ass for one second to discuss actual programming, and refuse to actually show me something with macros, you're not worth the replies.

Fuck off, and go piss about with your portable assembler.

Name: Anonymous 2016-01-21 12:01

retro home computers
Symbolics Pascal
These are historical curiosities. By that logic I could claim any language is "an independent, strong language that needs no C" because in the past it was written in assembler as an unportable mess (BASIC: the ultimate language with no C/C++ baggage, because in year XXXX it was free of C/C++ code).
portable assembler.
Portability is extremely useful long-term and allows C to outlive architectures - while arch-special snowflakes die with them.

Name: Anonymous 2016-01-21 12:16

You know the preprocessor is shit, and the hacks people are doing with it are just playing with it for fun.
void.h is used in writing real code.
Boost.Preprocessor is used in writing real code.
The Chaos/Order libraries are used in writing real code.
Lisp macros are used to write single-purpose toy programs to demonstrate the superiority of Lisp.

Name: Anonymous 2016-01-21 12:22

Challenge:
implement p() from void.h's textio.h subheader without using macros in C.
If the preprocessor is "used only for fun", there should be no problem using plain C without it.
syntax
p(args...) prints arguments to stdout.
example: p(1,2.3,"LITHP") results in:
1 2.3 LITHP

Name: Anonymous 2016-01-21 17:05

>>76
What the fuck are you smoking? Even just looking at the very first benchmark:
What are YOU smoking, or rather what are you looking at?
Binary trees is here:
https://benchmarksgame.alioth.debian.org/u64q/performance.php?test=binarytrees
Lisp is ~8 times slower than C.

Name: Anonymous 2016-01-21 17:20

>>84
Even if Lisp was faster than C in one benchmark, it's probably a non-idiomatic port of C-like code.
Stalin Scheme reaches near-C performance this way: it optimizes Scheme code into a chunk of tight C code that's then compiled/optimized by GCC and reaches native C speed.
That of course doesn't make mainstream Lisp faster: it just demonstrates that Lisp/Scheme compiler writers create shitty optimizers and rely on GCC/Clang/etc. to clean up their mess.

Name: Anonymous 2016-01-21 17:48

>>84
Looking at those particular results, C is pulling in Apache's memory pooling library, while Lisp is doing per-node allocation for tree nodes. Not quite apples to apples. Either both should be allocating nodes on an individual basis, or both should be using pooling libraries.

Name: Anonymous 2016-01-21 17:52

>>86
while Lisp is doing per-node allocation for tree nodes.
SBCL uses a GC.

Name: Anonymous 2016-01-21 17:56

The SBCL version isn't threaded, either.

Name: Anonymous 2016-01-21 17:59

>>87
Preallocated memory avoids malloc/free in non-GC languages, and avoids alloc/gc-pressure in GC languages. It's a reasonable manual optimization in any language.

Name: Anonymous 2016-01-21 18:03

Not a single Lisper bothered to write a better version.
Haskellers, on the other hand, explicitly target the benchmarks game site in their goals; see
https://wiki.haskell.org/Benchmarks_Game

Name: Anonymous 2016-01-21 18:04

PREALLOCATE MY ANUS!

Name: Anonymous 2016-01-21 18:06

I'm certainly not going to bother writing a faster version. How many language shootout sites are there?

But certainly memory pooling + threading on a quad core box is in the realm of an 8x speedup.

Name: Anonymous 2016-01-21 18:48

>>89
If SBCL's GC does not already optimize doing lots of small allocations it's a terrible fucking GC.

Name: Anonymous 2016-01-21 21:14

>>93
It does. The allocation is a small pointer bump & check, not even a function call. However, this benchmark exercises exhausting allocation pools and free/realloc cycles, potentially forcing sbrks or multiple TLABs in any style of memory system. That's kind of why it's a benchmark, and that's why pooling would give it around a 2x speedup, sidestepping the entire system that it's attempting to benchmark.

Name: Anonymous 2016-01-21 21:22

>>94
In other words: GCs are shit, and any language or implementation based on one will be slow as shit.

Name: Anonymous 2016-01-21 21:43

>>95
Non-mempooled, GC will be faster than malloc/free, especially for the kind of workload this benchmark is pushing.

Mempooled, the allocator is bypassed and it's the same between languages.

Name: Anonymous 2016-01-21 23:42

who /mem-pilled/ here?

Name: Anonymous 2016-01-22 0:05

>>97
I'm /meme-pilled/

Name: Anonymous 2016-01-22 0:08

My memory of shit memes drives me to claim this spot.

Name: Anonymous 2016-01-22 0:08

...and this one, too.

Name: Anonymous 2016-01-22 21:13

>>96
A valid implementation of malloc and free are:
static char mem[~0U];
static size_t nalloc;

void *malloc(size_t n)
{
void *p = mem+nalloc;
nalloc += n;
return p;
}

void free(void *p)
{
// no-op
}

The standard imposes to real semantic requirements on these functions.
malloc() is allowed to fail at any time.

Name: Anonymous 2016-01-22 21:21

>>101
nice memory leak.

but in all seriousness, I like this code a lot. It's a great first step, and it's obvious how one could go further with it: just add an extra array that lists all the allocated blocks and whether they're in use, and then you can copy & compact (like a GC!) every few thousand frees.

Name: !OUKY5mcbp6 2016-01-22 21:53

>>102
yeah, and then all your pointers will be invalid.
retards like you is why software is so bad

Name: Anonymous 2016-01-23 0:06

>>104
Patching them up is faster than free() on every little thing.

Name: !OUKY5mcbp6 2016-01-23 0:20

>>104
you are actually braindead

Name: Anonymous 2016-01-23 0:26

>>104
don't reply to tripfags

Name: Anonymous 2016-01-24 18:13

>>106
shut the fuck up you moron nobody gives a flying fuck about your childish imageboard drama

Name: Anonymous 2016-01-24 19:50

>>107
what are you talking about?

Name: Anonymous 2016-01-24 19:51

>>107
this is a textboard

Name: Anonymous 2016-01-25 23:57

>>108-109
``tripfags'' are the imageboard's favorite boogeymen

Name: Anonymous 2016-01-26 2:46

>>110
``tripfag''? you mean, someone who gets trips?

Name: Anonymous 2016-01-26 2:49

>>11111111111111111111111

NNNIIICCEEE TTTRRIIIPPPSSS

Name: Anonymous 2016-01-26 3:03

>>108,109
Why did you reply twice?

>>112
Who are you quoting?

Name: Anonymous 2016-01-26 3:20

>>113
oh my leg gosh EGGS DEE!!!! rofl HAHAHAH le `WHOM` are yo le queoting face XDDDDD

Name: Anonymous 2016-01-26 6:11

>>111
Nice trips

Name: Anonymous 2016-01-26 8:19

THE DEFENDERS OF THE SOUTH SQUADRON GET

Name: Anonymous 2016-01-27 10:19

>>113
being an anus

Name: Anonymous 2016-01-27 19:05

>>117
That's not what I said!
