
Why has computing not advanced past the sixties?

Name: Anonymous 2015-01-17 5:00

Computer science, which has little to do with either computers or the scientific method, has progressed (or at least drifted around laterally), of course, but computing has not. By that I mean the physical implementations of machines that can execute the process calculi computer science dreams up. Back in the sixties there was already talk of crafting binaries that contained data dependency information, so that the processor could do unrestricted out-of-order execution. We still don't have that, despite overcoming the memory limitations of that era many times over. So many promising innovations, and hardly any were even tried.

Are engineers just lazy? Did IBM, Intel, and Microsoft intimidate everyone enough to never focus on anything but their shitty, low-end trash?
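To spell out what I mean: in a dataflow-style encoding, each instruction names its inputs explicitly and fires the moment they're ready, in no fixed order. A toy sketch in Python (the instruction representation is invented purely for illustration):

    import operator

    # Each instruction carries its data dependencies explicitly:
    # (dest, op, src1, src2). Anything whose sources are ready may
    # fire, in any order; nothing else decides the schedule.
    program = [
        ("a", operator.add, "x", "y"),
        ("b", operator.mul, "x", "x"),   # independent of "a", may fire first
        ("c", operator.sub, "a", "b"),   # waits for both "a" and "b"
    ]

    regs = {"x": 3, "y": 4}              # live-in values
    pending = list(program)
    while pending:
        for instr in pending:
            dest, op, s1, s2 = instr
            if s1 in regs and s2 in regs:            # operands ready: fire
                regs[dest] = op(regs[s1], regs[s2])
                pending.remove(instr)
                break

    print(regs["c"])                     # (3 + 4) - (3 * 3) == -2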

Name: Cudder !MhMRSATORI 2015-01-17 5:17

>binaries that contained data dependency information, so that the processor could do unrestricted out-of-order execution
They all do... and apparently you've not noticed the small revolution that occurred with the P6 and, more recently, Core?

Unless you're referring to VLIW, which was an Itanic failure.
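For the thread: what the P6 actually brought is dynamic scheduling. The hardware discovers the dependencies itself at run time and renames architectural registers to fresh physical ones, so false hazards disappear. A toy sketch of the renaming half (all names invented):

    from itertools import count

    # Toy register renaming: each architectural write gets a fresh
    # physical register, so write-after-write and write-after-read
    # hazards on "r1" simply disappear.
    phys = count()
    rename = {"r2": "p_init2", "r3": "p_init3"}   # live-in mappings

    def issue(dest, srcs):
        ins = [rename[s] for s in srcs]       # read current mappings first
        rename[dest] = "p%d" % next(phys)     # fresh physical destination
        return rename[dest], ins

    print(issue("r1", ["r2", "r3"]))   # r1 <- r2 + r3: ('p0', ['p_init2', 'p_init3'])
    print(issue("r1", ["r3", "r3"]))   # r1 reused, but gets 'p1': no WAW hazard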

Name: Anonymous 2015-01-17 21:39

>>1
It has advanced past the sixties, my dear friend.

The problem is that the viscous Jews who run all of the tech companies are keeping the discoveries made by their European employees a secret for themselves. They're crafting themselves YHWH-class oracles using memresistive quantum computers capable of super-Turing computation, while shoveling the scraps towards the masses to keep us enslaved in shekel debt.

That's right, the Jews are exploring the mathematical Megaverse, simulating new realities for profit and leisure--creating digital heavens for themselves--while the rest of us are left to live as slaves on the gravity well known as Earth.

Name: Anonymous 2015-01-17 21:59

>>3
Lain will intervene.

Name: Anonymous 2015-01-17 22:02

>>4

Still waiting on Protocol 7

Name: Anonymous 2015-01-17 23:50

>>2
Itanium was the classic Intel failure: the hardware designers created something that no one could write good software for, and then congratulated themselves for it. VLIW has found a decent niche in DSP and GPU applications, so it's clearly useful when the system designers know what they're doing.
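To make the contrast with >>2 concrete: in VLIW the compiler, not the hardware, packs independent operations into one wide instruction, and any slot it can't fill is wasted issue width. A toy sketch in Python (the bundle format is invented for illustration):

    # Toy VLIW: the *compiler* packs independent ops into fixed-width
    # bundles; slots it can't fill become NOPs. The hardware executes
    # the slots in lockstep with no run-time dependency checks.
    def execute_bundle(bundle, regs):
        # Read every slot's operands before any slot writes: the
        # compiler already guaranteed the slots are independent.
        results = [(dst, fn(regs[a], regs[b])) for dst, fn, a, b in bundle if fn]
        for dst, val in results:
            regs[dst] = val

    regs = {"x": 3, "y": 4}
    bundle = [
        ("a", lambda p, q: p + q, "x", "y"),   # slot 0: a = x + y
        ("b", lambda p, q: p * q, "x", "x"),   # slot 1: b = x * x, independent
        (None, None, None, None),              # slot 2: NOP, wasted issue width
    ]
    execute_bundle(bundle, regs)
    print(regs["a"], regs["b"])                # 7 9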

Name: Anonymous 2015-01-18 0:50

>>3
I can do that in my head.

Name: Anonymous 2015-01-18 3:45

>>7
Canst thou hax my anus in thine head?

Name: Anonymous 2015-01-18 6:39

A lot of relatively recent parallel computing theory is now applied to GPGPU. Intel is now putting FPGAs on some of their chips. Too bad they cost $50k each, but hopefully the price will come down and AMD will offer something similar (unlikely though, since they are fabless) (and ARM competes for pennies on phone contracts, so they'll never do it), so we may one day be able to experiment with little Lisp machines in hardware. There was Cell, but Sony let it die, and most programmers are idiots who never realized its potential. Now the market for supercomputers is dead outside of weather forecasting and other government shit, so neat new paradigms are unlikely to be tried, since even the most minute differences in floating point treatment ruin weather models. They may see a comeback for intensive, non-parallelizable cases, but otherwise the trend of throwing tons of the cheapest shit at the problem is probably going to continue. Dataflow is dead and will remain dead until Oracle realizes its usefulness in databases, patents it, and starts selling $50M machines to idiotic people who need a 5% speed increase in their queries.
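The floating point complaint is real, for what it's worth: addition isn't associative, so changing the reduction order (which any new parallel paradigm does) changes the answer. A quick demonstration:

    # Floating point addition is not associative, so summing the same
    # numbers in a different order (as a parallel reduction would)
    # gives a different result.
    a, b, c = 1e16, -1e16, 1.0
    print((a + b) + c)   # 1.0
    print(a + (b + c))   # 0.0 -- the 1.0 is absorbed into -1e16 first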

Name: Anonymous 2015-01-18 6:53

uh idiot??? Ever heard of Python? Get your head out of the sand, a LOT has improved since the sixties... I just wrote an IRC bot in under a thousand lines because of Python libraries.
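You barely even need a thousand lines. A minimal sketch using only the standard library; the server, channel, and nick below are placeholders, not a real network:

    import socket

    HOST, PORT = "irc.example.net", 6667   # placeholder server
    NICK, CHAN = "progbot", "#prog"        # placeholder nick and channel

    sock = socket.create_connection((HOST, PORT))
    sock.sendall(f"NICK {NICK}\r\nUSER {NICK} 0 * :{NICK}\r\n".encode())

    buf = b""
    while True:
        buf += sock.recv(4096)
        while b"\r\n" in buf:
            line, buf = buf.split(b"\r\n", 1)
            msg = line.decode(errors="replace")
            if msg.startswith("PING"):                  # keep the connection alive
                sock.sendall(("PONG" + msg[4:] + "\r\n").encode())
            elif " 001 " in msg:                        # welcome reply: safe to join
                sock.sendall(f"JOIN {CHAN}\r\n".encode())
            elif "PRIVMSG" in msg and "!ping" in msg:   # the bot's one trick
                sock.sendall(f"PRIVMSG {CHAN} :pong\r\n".encode())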

Name: Anonymous 2015-01-18 9:05

>>9
I think you're going to have to clarify what you mean by "supercomputers". Asymmetric multiprocessing and non-uniform memory architectures are probably inevitable in the long run, but I wouldn't hold my breath for them to suddenly become popular.
