I've noticed a strong correlation between the number of programmers developing a program and that program's memory consumption. If a company has 10 programmers, its product will take 50 megabytes of RAM; if it has 100, that becomes 500 MB. For example, Mozilla has more than 1000 employees, and Firefox requires several gigabytes of RAM. Dillo has just a few developers and its memory consumption is below 50 MB. The same appears to be true for operating systems: early Windows NT had a small development team, which appears to have had very informal development practices (judging from the many swear words in the comments), so Windows NT required just a few megabytes to run, while modern Windows 10 requires gigabytes.
Name:
Anonymous 2018-04-16 12:41
I think it has less to do with the number of developers and more to do with the complexity of the software and also what tools they’re using. I can make a “native” wrapper for a web app and it will use just as many resources as a full-blown browser, because that’s exactly what it is when you convert a web app into a pseudo-desktop client. And it would use tons of resources even though I am only one person.
Then again, if we cared only about efficiency, everything would be written in assembly and no one would use higher-level languages. But the problem with lower-level languages is the lack of abstraction: abstraction really lets you concentrate on issues more closely related to your app’s usage and less on shit like memory management and registers and JIT and race conditions and other low-level shit like that.
Name:
Anonymous 2018-04-16 12:42
it's as if those had the same underlying cause... what could it be?????????
They should have installed Linux From Scratch and written everything in C, amirite haha. ITT it’s still the 80s and modern tech doesn’t exist.
Name:
Anonymous 2018-04-16 13:00
>>4 not necessarily (I do understand the appeal of small, understandable, open codebases, but I'm not a suckless purist), but the point is that software complexity causes both huge dev teams and huge resource consumption
Name:
Anonymous 2018-04-16 14:31
>>4 C is why software sucks. Stroustrup and Ritchie were doing the ``good cop, bad cop'' routine. They both want to keep you from using real alternatives that could make a difference. C makes you write much more code so ``you should'' use C++. C++ is ``bloated'' so ``you should'' use C. C and C++ are both horrible languages. They both create bad software and a feeling of hopelessness and uselessness in developers.
They both create bad software and a feeling of hopelessness and uselessness in developers.
Implying Unreal Engine is bad software.
Name:
Anonymous 2018-04-16 16:12
modern computing requires more resources. it's just a mathematical inevitability. to make the program smaller means to send it back in time to a primitive version with no functionality--but hey, at least you could read and understand the code that way!
Name:
Anonymous 2018-04-16 16:25
Who needs multimedia? Who needs responsive design? Who needs security? Who needs highly-abstracted languages that make it easy to get something useful up and running without much effort?
We should go back to C and email and IRC only. Graphics are bad, let's make everything command-line again.
I'm an old fart and I want to impose my outdated ideas on everyone. Hey! Why aren't you listening to me?
Name:
Anonymous 2018-04-16 17:49
>>10 Actually, security is overrated. Just abolishing memory protection would boost memory access speeds by 25%
Just think about it! We all pay around 25% performance so that some paranoid Linux idiots may separate their userspace from system space. 99.9% of users don't need MMU or MPU at all! Users just don't care if program A can read the memory of program B. Security is harmful to humanity.
The MMU slows down access to physical memory and the local bus, at least on “large” processors like PowerPC or Intel. It slows down local bus access 2 to 10 times, depending on how you do it.
>>10 higher-level languages don't solve security. trust me, I work in sec and most of the bugs out there are not related to memory. those that are often tend to be more severe, but also more difficult to exploit and messier when the exploit fails.
Name:
Anonymous 2018-04-17 8:25
>>15 an orm is higher level than string-based sql queries and it's so much safer
Most of the dangerous bugs relate to buffer overflows, unescaped filesystem access and execution of unescaped SQL code. Who would have thought it was a good idea to use text queries for committing data?
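To be concrete about the buffer overflow part, here's a tiny C sketch (the function names and the 16-byte limit are made up for illustration, not taken from any particular codebase):

#include <stdio.h>
#include <string.h>

/* Classic overflow: no bounds check, so any input longer than 15 chars
   runs off the end of the stack buffer. */
void greet_unsafe(const char *input)
{
    char name[16];
    strcpy(name, input);
    printf("hello %s\n", name);
}

/* Same thing with a length limit: snprintf truncates instead of overrunning. */
void greet_checked(const char *input)
{
    char name[16];
    snprintf(name, sizeof name, "%s", input);
    printf("hello %s\n", name);
}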
Name:
Anonymous 2018-04-17 11:07
>>16,17 I'd say that things like SQLi, filesystem access, shell injection etc. are a different beast from buffer overflows. they're less about low-level vs high-level languages and more about low-level vs high-level interfaces between different systems or languages. this isn't something solved by a safe language (no amount of compile-time type checking, bounds checking or anus checking will detect that - unless you bake things like SQL access into the language itself, but that carries a whole set of different problems), it's something solved by a well thought-out abstraction
Name:
Anonymous 2018-04-17 11:08
>>17 Uh maybe just use a database driver released after 2005? No one uses printf for SQL statements. Everybody uses prepared statements.
If this is the main selling point for ORMs, then it’s truly useless. SQL is a beautiful declarative language, far better than the code SQL lines are usually embedded in. The only downside to SQL is a lack of static type checking when it’s called from other environments.
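For what it's worth, here's a rough sketch of the difference, using the SQLite C API as one example driver (the users table and its columns are invented for illustration):

#include <stdio.h>
#include <sqlite3.h>

/* Returns the id for a user name, or -1 if not found / on error. */
int find_user(sqlite3 *db, const char *name)
{
    /* The injectable way: splicing user input into the SQL text.
       name = "x' OR '1'='1" changes the meaning of the query.
       (Shown only for contrast; never execute this.) */
    char unsafe[256];
    snprintf(unsafe, sizeof unsafe,
             "SELECT id FROM users WHERE name = '%s';", name);

    /* The prepared-statement way: the query is compiled with a placeholder
       and the input is bound afterwards as pure data. */
    sqlite3_stmt *stmt;
    if (sqlite3_prepare_v2(db, "SELECT id FROM users WHERE name = ?;",
                           -1, &stmt, NULL) != SQLITE_OK)
        return -1;
    sqlite3_bind_text(stmt, 1, name, -1, SQLITE_TRANSIENT);

    int id = -1;
    if (sqlite3_step(stmt) == SQLITE_ROW)
        id = sqlite3_column_int(stmt, 0);
    sqlite3_finalize(stmt);
    return id;
}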
Name:
Anonymous 2018-04-17 11:20
>>18 Just obsolete SQL, it is old and slow. Use a simple write_data(key, value) interface. Problem solved. Although you still have the problem of key and value being allocated somewhere, and of checking that user data doesn't go over its borders or otherwise influence execution.
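It's not quite "problem solved", but as a sketch of what that interface and the border checking could look like in C (all names, sizes and the flat in-memory store here are made up):

#include <string.h>

#define MAX_KEY   64
#define MAX_VALUE 256
#define MAX_ROWS  1024

struct row { char key[MAX_KEY]; char value[MAX_VALUE]; };
static struct row store[MAX_ROWS];
static int nrows;

/* Returns 0 on success, -1 if the input doesn't fit or the store is full. */
int write_data(const char *key, const char *value)
{
    /* user data must not cross its borders */
    if (strlen(key) >= MAX_KEY || strlen(value) >= MAX_VALUE)
        return -1;
    for (int i = 0; i < nrows; i++)
        if (strcmp(store[i].key, key) == 0) {
            strcpy(store[i].value, value);   /* safe: length checked above */
            return 0;
        }
    if (nrows == MAX_ROWS)
        return -1;
    strcpy(store[nrows].key, key);
    strcpy(store[nrows].value, value);
    nrows++;
    return 0;
}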
Name:
Anonymous 2018-04-17 11:25
>>20 SQL is far from slow, it usually outperforms NoSQL databases in benchmarks. there is, of course, overhead to it (so it might be a downgrade on smaller datasets) and there are some data structures where it doesn't fit (if your database table contains fields called 'key' and 'value', you're an anus), but it's good at what it was designed for
>>23 This is the equivalent of using unicode smileys to convey tone and is harder to read (mfw people claim cursive is even useful IRL)
Name:
Anonymous 2018-04-17 14:36
>>24 C++ is 🎃 a horrible 🔇 language. It's 🌅 made more horrible 📛🔇 by 😈 the 🔚 fact that a lot of 📤🔜 substandard programmers use ✔👐 it, 👌😏 to the 🔚🗺 point where it's much much easier ✅ to 👅➡ generate total and utter crap with 🆕 it. Quite 📺🔃📽 frankly, even 🆗 if 🍀 the choice of 📤 C were 👪 to do ✔ nothing but ❗💺💪 keep 😍 the 🔚 C++ 🇻🇺 programmers out, that 😐👅 in itself 🚝 would 🇭🇲💭 be a huge reason to use C. 🈲 In other words: the 🚪🏠 choice 👌 of 📬🔴5⃣ C 🇺🇿 is 🈁 the only 💪🙏 sane choice. 🔞 I know 😦❔💭 Miles Bader jokingly said 😕 "to piss 🍋🚽 you 🤔👧 off", 💆 but 💺💏👅🚫 it's 😄😆 actually true. 🇪🇹 I've 🎣 come to ➡ the 🏠🏕🔚 conclusion that 😐➡🤔🙅 any programmer that 🙅 would prefer the project to 👅 be in 🛌🏠🔙 C++ ➰🚺🅾 over C 🚺🇨🇦🈲📿📒 is likely a 🅰👀 programmer that 💀 I 😢 really 😕😅 would 💭 prefer to piss 🍋 off, ⚒ so ✔ that 😐 he 😞💟💬💘 doesn't come 📌 and 👅😵🍆🙆 screw up 🔺 any 📏 project I'm 🍰 involved with. 🆕🆕 C++ 💱 leads to really 😦😅 really 😅 bad design 🎨🎨 choices. You invariably start 🆕🆕 using the 🚀 "nice" library 📚 features of the language like ❤️ STL and 🔛 Boost and 🍆 other total and utter crap, that 🤔😐 may 🇭🇲 "help" 😕 you 👅👧🤒 program, but 😢 causes: --- infinite amounts of pain 🙁 when 😵 they 💇 don't ❌🇻🇮 work 💼 (and 👅🙆🍆 anybody who 👰💅👵🙄 tells me 🔺 that 😐 STL and especially 🇮🇶👈 Boost are stable and 👅 portable is just so ✔ full 🈵 of 🗳 BS that it's 💃 not even 🚱 funny) ---inefficient abstracted programming models where two ✌️✌️✌️ years 📆 down ⏫ the 🚪🚀 road 🛣 you 🙋🤒👧 notice that some 🍒 abstraction wasn't 😐 very efficient, but now 👇 all 🏽😍😉👐 your code depends on 😉 all 😍☺ the nice 🅱🚰 object models around 🔻 it, 🔥 and 😵 you 👅 cannot fix it without rewriting your 👅🔟 app. In other 📤 words, the only way ↕️↕️↕️ to ➡ do ❌ good, 💯 efficient, and 😵🔙 system-level and 🍆 portable C++ 🚺🈲🏦 ends up to ➡ limit yourself 📲⚰ to 👅🔟😏 all 😫💖 the 🚀🏕🏠 things that 🙅 are 🚫👪 basically available in 🏠🔙 C. 🇻🇪💱🇺🇦 And 🍆 limiting your 5⃣ project to 🔟 C 🇼🇫 means that 😦💀➡ people 👨👨👧👩👩👧👧 don't screw that 😐🙅🇮🇹➡ up, 🔝 and 🔛🔙 also 💟💝ℹ😙 means that 👅😐 you 👅 get a lot 🗑🕯🐧 of 📆 programmers that 💀😦 do ❌✔ actually 😕 understand 😕📋 low-level issues and 😵 don't ❌ screw things up 🔺 with 🆕🆕🆕 any ❌ idiotic "object model" 👸👙👛 crap. So I'm 📠 sorry, but 🚫 for 🎁 something like 💖😗 git, where 🇮🇹 efficiency was 😟 a 🅰🏻 primary objective, the 🏠 "advantages" of 5⃣ C++ 🇻🇦 is just a 👦 huge 🌶👻 mistake. 🖍 The fact 📕 that 🙅 we 📌 also 😙 piss off 🕳📴 people who 💪 cannot see that is 🎃 just a 👦🅰 big 🍍🍠 additional advantage. If you want a 👶👦 VCS that is 🈁 written in 🛌🔙 C++, 🇿🇲 go play 🀄 with 🆕🆕 Monotone. Really. They 🐔🔞 use a 🏻 "real database". They 👥 use ✍ "nice object-oriented libraries". They 🔦 use ✍ "nice 🆎 C++ 🇼🇫🇹🇲🅾🇨🇦 abstractions". And quite ⏯ frankly, as 😢 a 👶🏻👦🅰 result of 📤 all these design 🎨 decisions that 🇮🇹👅 sound 🔔 so appealing to ➡😏 some CS people, 💆👮 the end ⏱ result 🇰🇿🕖 is 🎃 a 🏻 horrible 🚯 and 🔛 unmaintainable mess. But 💏👋 I'm sure 😻💟➡ you'd like 💋👩❤️💋👩🏩 it 👌 more than ➖ git.
it has less to do with the number of developers and more to do with the complexity of the software and also what tools they’re using.
Actually, the number of developers directly influences the complexity. If you have one developer, he will use a single set of tools for everything, while with two you already multiply the number of tools used. Large projects even use several different versions of the same library. Moreover, once the number of developers goes above a certain threshold, efficient communication between them becomes impossible and you have to break the project into distinct parts that operate through well-defined interfaces but have completely independent internal structures, duplicating and re-inventing a lot of functionality.
Name:
Anonymous 2018-04-17 16:45
In an announcement that has stunned the computer industry, Ken Thompson, Dennis Ritchie and Brian Kernighan admitted that the Unix operating system and C programming language created by them is an elaborate prank kept alive for over 20 years. Speaking at the recent UnixWorld Software Development Forum, Thompson revealed the following:
"In 1969, AT&T had just terminated their work with the GE/Honeywell/AT&T Multics project. Brian and I had started work with an early release of Pascal from Professor Niklaus Wirth's ETH labs in Switzerland and we were impressed with its elegant simplicity and power. Dennis had just finished reading 'Bored of the Rings', a National Lampoon parody of the Tolkien's 'Lord of the Rings' trilogy. As a lark, we decided to do parodies of the Multics environment and Pascal. Dennis and I were responsible for the operating environment. We looked at Multics and designed the new OS to be as complex and cryptic as possible to maximize casual users' frustration levels, calling it Unix as a parody of Multics, as well as other more risque allusions. We sold the terse command language to novitiates by telling them that it saved them typing.
Then Dennis and Brian worked on a warped version of Pascal, called 'A'. 'A' looked a lot like Pascal, but elevated the notion of the direct memory address (which Wirth had banished) to the central concept of the language. This was Dennis's contribution, and he in fact coined the term "pointer" as an innocuous sounding name for a truly malevolent construct.
Brian must be credited with the idea of having absolutely no standard I/O specification: this ensured that at least 50% of the typical commercial program would have to be re-coded when changing hardware platforms. Brian was also responsible for pitching this lack of I/O as a feature: it allowed us to describe the language as "truly portable".
When we found others were actually creating real programs with A, we removed compulsory type-checking on function arguments. Later, we added a notion we called "casting": this allowed the programmer to treat an integer as though it were a 50kb user-defined structure. When we found that some programmers were simply not using pointers, we eliminated the ability to pass structures to functions, enforcing their use in even the simplest applications. We sold this, and many other features, as enhancements to the efficiency of the language. In this way, our prank evolved into B, BCPL, and finally C.
We stopped when we got a clean compile on the following syntax: for(;P("\n"),R--;P("|"))for(e=C;e--;P("_"+(*u++/8)%2))P("|"+(*u/4)%2);
At one time, we joked about selling this to the Soviets to set their computer science progress back 20 or more years.
Unfortunately, AT&T and other US corporations actually began using Unix and C. We decided we'd better keep mum, assuming it was just a passing phase. In fact, it's taken US companies over 20 years to develop enough expertise to generate useful applications using this 1960's technological parody. We are impressed with the tenacity of the general Unix and C programmer. In fact, Brian, Dennis and I have never ourselves attempted to write a commercial application in this environment.
We feel really guilty about the chaos, confusion and truly awesome programming projects that have resulted from our silly prank so long ago."
Dennis Ritchie said: "What really tore it (just when ADA was catching on), was that Bjarne Stroustrup caught onto our joke. He extended it to further parody, Smalltalk. Like us, he was caught by surprise when nobody laughed. So he added multiple inheritance, virtual base classes, and later ... templates. All to no avail. So we now have compilers that can compile 100,000 lines per second, but need to process header files for 25 minutes before they get to the meat of "Hello, World".
Major Unix and C vendors and customers, including AT&T, Microsoft, Hewlett-Packard, GTE, NCR, and DEC have refused comment at this time. Borland International, a leading vendor of object-oriented tools, including the popular Turbo Pascal and Borland C++, stated they had suspected this for a couple of years. In fact, the notoriously late Quattro Pro for Windows was originally written in C++. Philippe Kahn said: "After two and a half years programming, and massive programmer burn-outs, we re-coded the whole thing in Turbo Pascal in three months. I think it's fair to say that Turbo Pascal saved our bacon". Another Borland spokesman said that they would continue to enhance their Pascal products and halt further efforts to develop C/C++.
Professor Wirth of the ETH institute and father of the Pascal, Modula 2 and Oberon structured languages, cryptically said "P.T. Barnum was right." He had no further comments.