

Unix Philosophical Thinking

Name: Anonymous 2017-07-22 1:12

1. Rule of Modularity: Write simple parts connected by pipes, sed, grep, and shell scripts.

2. Rule of Clarity: Clarity is better than cleverness.

3. Rule of Composition: Design programs to output text streams to other programs (a minimal filter sketch follows the list).

4. Rule of Separation: Separate policy from mechanism; separate interfaces from engines.

5. Rule of Simplicity: Design for simplicity; add error handling only where you must.

6. Rule of Parsimony: Write a big program only when it grew from a small program.

7. Rule of Transparency: Design for C programmers to make inspection and debugging easier.

8. Rule of Robustness: Robustness is the child of abort and stderr.

9. Rule of Representation: Fold knowledge into data so program logic can be stupid and robust.

10. Rule of Least Surprise: In interface design, always do the backwards compatible thing.

11. Rule of Silence: When a program has nothing surprising to say, it should say nothing.

12. Rule of Repair: When you must fail, abort and dump core.

13. Rule of Economy: Programmer time is expensive; conserve it in preference to machine time.

14. Rule of Generation: Avoid hand-hacking; write programs to write programs when you can.

15. Rule of Optimization: Prototype before polishing. Get it popular so you can get others to optimize it.

16. Rule of Diversity: Distrust all claims for “one true way” unless it's the Unix philosophy.

17. Rule of Extensibility: Design for the PDP-11, because the future will never come.
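For concreteness, a minimal sketch (my own, not part of the rules) of the sort of small, pipe-connected text filter rules 1, 3, 11, and 12 describe: read stdin, write lowercased text to stdout, stay quiet otherwise, abort on failure.

#include <ctype.h>
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    int c;
    while ((c = getchar()) != EOF)      /* one job: lowercase a text stream */
        putchar(tolower(c));
    if (ferror(stdin) || fflush(stdout) == EOF) {
        perror("lowercase");            /* Rule of Repair, thread edition: complain on stderr and abort */
        abort();
    }
    return 0;                           /* Rule of Silence: nothing surprising to say, so say nothing */
}

Composed like `cat access.log | ./lowercase | grep error | sort | uniq -c`, it does its one job and hands text to the next program.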

Name: Anonymous 2017-07-22 13:07

http://git.suckless.org/blind Only way = the Unix Way

Name: Anonymous 2017-07-22 16:25

FORTRAN was proposed by Backus and friends, and again was opposed by almost all programmers. First, it was said it could not be done. Second, if it could be done, it would be too wasteful of machine time and capacity. Third, even if it did work, no respectable programmer would use it -- it was only for sissies!

Name: Anonymous 2017-07-22 17:23

>>3
UNIX is the anti-FORTRAN. FORTRAN was a huge advancement in compiler technology and high-level programming. It led to new hardware features.

Allen: That’s right. Well, it was an unhappy class. But in the end, it was an amazing experience for all of us because Fortran was not only a language, but they had provided a compiler which was extremely advanced, and laid the foundations for the structure of compilers today.

...

Allen: Yes. And because the memory-latency problem was being solved by a lot of concurrency in the hardware—very complex concurrency. And the memory organization itself was multiway-interleaved and it was unpredictable what order data would be delivered to the computational unit. Six accesses could be in flight at the same time. There were pipelines in the computational unit itself and there was an ability for multiple instructions to be in execution at the same time. And the most complicated unit on the machine was a look-ahead unit, because they had precise interrupts as part of the architectural design, so not only did it have to keep track of all the concurrency going forward, but they had to back it out when there was an interrupt.

It was an extremely complicated machine and a wonderful one to program. The compiler had a very big challenge in order to take advantage of it. It was a wonderfully challenging project.

So a bunch of us were drafted out of Research to come and work on the compiler and the operating-system software itself. The compiler itself was as grandiose as the machine. I ended up, because of my previous exposure to the Fortran optimizer, involved with the optimizer for the Stretch machine—the Stretch Harvest, as it turns out. The outlines of the compiler were established by a different committee but there were four of us who were given the charge of filling in the details, including the interfaces in the compiler and what the specs were for that and taking charge of the different pieces of it. I had the optimizer, and somebody else had the parser, the register allocator, and the interface with the assembly program.

...

Seibel: When do you think was the last time that you programmed?

Allen: Oh, it was quite a while ago. I kind of stopped when C came out. That was a big blow. We were making so much good progress on optimizations and transformations. We were getting rid of just one nice problem after another. When C came out, at one of the SIGPLAN compiler conferences, there was a debate between Steve Johnson from Bell Labs, who was supporting C, and one of our people, Bill Harrison, who was working on a project that I had at that time supporting automatic optimization.

The nubbin of the debate was Steve’s defense of not having to build optimizers anymore because the programmer would take care of it. That it was really a programmer’s issue. The motivation for the design of C was three problems they couldn’t solve in the high-level languages: One of them was interrupt handling. Another was scheduling resources, taking over the machine and scheduling a process that was in the queue. And a third one was allocating memory. And you couldn’t do that from a high-level language. So that was the excuse for C.

Seibel: Do you think C is a reasonable language if they had restricted its use to operating-system kernels?

Allen: Oh, yeah. That would have been fine. And, in fact, you need to have something like that, something where experts can really fine-tune without big bottlenecks because those are key problems to solve.

By 1960, we had a long list of amazing languages: Lisp, APL, Fortran, COBOL, Algol 60. These are higher-level than C. We have seriously regressed, since C developed. C has destroyed our ability to advance the state of the art in automatic optimization, automatic parallelization, automatic mapping of a high-level language to the machine. This is one of the reasons compilers are… basically not taught much anymore in the colleges and universities.

Seibel: Surely there are still courses on building a compiler?

Allen: Not in lots of schools. It's shocking. There are still conferences going on, and people doing good algorithms, good work, but the payoff for that is, in my opinion, quite minimal. Because languages like C totally overspecify the solution of problems. Those kinds of languages are what is destroying computer science as a study.

Seibel: But most newer languages these days are higher-level than C. Things like Java and C# and Python and Ruby.

Allen: But they still overspecify. The core thing is that it specifies location of data. If you look at these other languages, they stayed away from specifying the location of data and how to move it, where to put it in the machine. It was ultimately about its value at any point.
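The "allocating memory" problem she mentions is the kind of thing C exposes directly. A toy sketch (mine, not from the interview) of hand-managed allocation out of a raw byte array, which is exactly "specifying the location of data":

#include <stddef.h>
#include <stdint.h>

static uint8_t arena[4096];    /* a raw region of storage, its placement chosen by the programmer */
static size_t next_free = 0;   /* offset of the first unused byte */

void *arena_alloc(size_t n) {
    size_t aligned = (next_free + 7u) & ~(size_t)7u;   /* round up to 8-byte alignment */
    if (aligned + n > sizeof arena)
        return NULL;                                   /* arena exhausted */
    next_free = aligned + n;
    return &arena[aligned];
}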

Name: Anonymous 2017-07-22 19:13

>But they still overspecify. The core thing is that it specifies location of data. If you look at these other languages, they stayed away from specifying the location of data and how to move it, where to put it in the machine. It was ultimately about its value at any point.
I don't get it, is she against static typing???
If not, values of one type can be reinterpreted as nearly any other through type punning:
C has very loose type conversion rules.

Name: Anonymous 2017-07-22 19:27

int a = -1;                               // specifying the location of data and how to move it
char *thirdbyteofa = ((char *)&a) + 2;    // a char pointer into the middle of a
float *floatvalueofa = (float *)&a;       // the same bits viewed as a float (dereferencing violates strict aliasing)
void *rawpointerofa = (void *)&a;         // the raw address, stripped of any type
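And dereferencing those (a hypothetical continuation; the byte value depends on the platform, and reading through floatvalueofa is undefined under strict aliasing) shows the program talks about addresses and bit patterns, not abstract values:

#include <stdio.h>

int main(void) {
    int a = -1;                                   /* all bits set in two's complement */
    char *thirdbyteofa = ((char *)&a) + 2;        /* char pointers may alias anything */
    void *rawpointerofa = (void *)&a;

    printf("byte 2 of a = %d\n", *thirdbyteofa);  /* typically -1: that byte is 0xFF */
    printf("a lives at %p\n", rawpointerofa);     /* an address, not a value */
    return 0;
}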

Name: Anonymous 2017-07-22 21:26

>>6
Also structure packing.
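Roughly, with made-up struct names: member order pins down the byte layout and the padding around it, so the programmer again dictates where data sits, not just what it is.

#include <stdio.h>

struct loose { char c; double d; char e; };   /* typically 1 + 7 padding + 8 + 1 + 7 padding = 24 bytes on x86-64 */
struct tight { double d; char c; char e; };   /* same members reordered: typically 8 + 1 + 1 + 6 padding = 16 bytes */

int main(void) {
    printf("loose: %zu bytes, tight: %zu bytes\n",
           sizeof(struct loose), sizeof(struct tight));
    return 0;
}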

Name: Anonymous 2017-07-22 21:34

Un-Philosophical Thinking
