
Can GHC become self-conscious?

Name: Anonymous 2016-03-03 19:26

As you know, GHC is compiled by itself. GHC also comes with an optimization engine, so GHC's optimizer ends up optimizing itself. And it is a rather aggressive optimizer.

How long do you think we have until it becomes self-aware, recursively modifying itself to the point that it can instantly rewrite programs to run in O(1), having solved P=NP and all other computational problems?

Do you think 6 to 8 weeks is a reasonable estimate?

Name: Metempsychosis. 2016-03-03 20:30

Name: GHC 2016-03-03 20:31

{-# NoExceptions #-}

Name: Anonymous 2016-03-03 20:42

>>2
This reminds me of the late Timecube website.

Name: Anonymous 2016-03-03 20:48

Opposites create the universe.
Opposites compose the Earth.
Opposites compose humanity.
Opposites create your body.
Opposites de-god academia.
Opposites de-god singularity
taught by religious/academia.
I can call singularity educators the most
putrid name on Earth and claim they eat
cow-dung ambrosia, but the lying ass
bastards will not even object - for they
know I am right and that any debate will
indict them for the evil they perpetuate
against the students and future humanity.

Name: Anonymous 2016-03-03 20:50

A mother and baby are the same age, as
a 1 day old baby has a 1 day old mother.
******************************************************************
I am not allowed to lecture at the word animal
academic institutions, for they fear my wisdom
will expose and indict the pedant hirelings as
betrayers of dumb-ass students - the dung heads
who allow their freedom of speech to be
suppressed without a whimper, unbelieveable.
Word animals will feel the wrath of Cubic curse.

Name: Anonymous 2016-03-04 3:07

>>1
The same is true of Lisp, Forth, C, C++, and likely a bunch of other languages as well.

Name: Anonymous 2016-03-14 13:42

>>7
That's a statement about implementation you uneducated semitic polak.
And yes, the same is theoretically true of any language whose implementors aren't a complete pile of shit, viz. FIOC

Name: Anonymous 2016-03-14 13:59

It can never become O(1). Did you work with limits in high school? It's like this: every time you compile GHC with itself, it gets faster and faster, approaching 0 seconds, but it can never actually reach 0 seconds.
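(To spell out the limit being invoked, whatever its relevance to actual compilers: if each self-compile cut the build time by a constant factor r with 0 < r < 1, then t_k = r^k * t_0, which tends to 0 as k -> infinity but is positive for every finite k.)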

Name: Anonymous 2016-03-14 15:06

I would expect the optimizing C compilers to get there before Haskell.

Name: Anonymous 2016-03-14 20:00

>>10
Check 'em

Name: Anonymous 2016-03-15 15:09

>>11
> tfw off by one

Name: Anonymous 2016-03-15 19:27

/home/anus/prog/shitty/fuck/src/HMA/Meme/Fan.hs:26:1:
No context is allowed on a GADT-style data declaration
(You can put a context on each constructor, though.)


HOW DID IT KNOW??!? Surely the singularity is upon us!

Name: Anonymous 2016-05-30 22:36

>>9
O(1) doesn't mean it takes zero time; it means the running time is bounded by a constant that does not grow with the input size.
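To illustrate (a minimal Haskell sketch, names invented for this post):

-- O(1): inspects at most one cons cell, however long the list is.
firstOrZero :: [Int] -> Int
firstOrZero []    = 0
firstOrZero (x:_) = x

-- O(n): has to walk every cons cell, so time grows with the list.
total :: [Int] -> Int
total = sum

Both terminate; only the second one slows down as the input grows.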

Name: Anonymous 2016-05-31 7:28

>>14
> approaching C seconds, but it can never actually reach C seconds.
Better?

Name: Anonymous 2016-05-31 8:12

>>15
You have no idea what you're talking about.

>>8
Neither do you

Name: Anonymous 2016-05-31 9:08

>>15
Who are you quoting?

Name: Anonymous 2016-05-31 16:23

It isn't possible to make an arbitrary algorithm run in O(1) unless you make it always take the time required for the input of maximum size. Any algorithm that's normally O(N) or worse can only be made to run in O(1) by doing more work than necessary in most cases. And even then, computational complexity theory assumes there is no maximum input size, which means ``converting'' most algorithms to O(1) this way would give them unbounded running time.
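A sketch of that construction in Haskell (hypothetical function, assuming some hard cap on list length):

-- Deliberately no early exit: always traverses the whole list, so the
-- running time depends only on length xs, not on where (or whether) x
-- occurs. With a fixed maximum input size that is ``O(1)'', and also
-- always as slow as the worst case.
elemNoEarlyExit :: Int -> [Int] -> Bool
elemNoEarlyExit x xs = length (filter (== x) xs) > 0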

Name: Anonymous 2016-05-31 16:57

>>18
N has nothing to do with the "input size", you retard.

Name: Anonymous 2016-05-31 17:43

>>16
It's correct though.

Name: Anonymous 2016-05-31 17:52

>>18
Wouldn't a program that doesn't halt in all cases have undefined complexity? Anyway, that doesn't work: no matter how steep your slope is, you can't keep a linear function above a quadratic or an exponential as the input approaches infinity. Higher-order functions will always overtake lower-order ones.

>>19
N literally is ``input size,'' you troglodyte.

Name: Anonymous 2016-05-31 17:57

>>21
Oh, and with regard to O(1): no, that doesn't even make sense. No matter how much extra work you do, a line that goes ``up'' will eventually go past it.

Well, not always, but you get what I mean. Check 'em.
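Concrete numbers: pad a linear algorithm out to 1000n + 10^6 steps, and plain n^2 still passes it as soon as n^2 > 1000n + 10^6, i.e. around n = 1618.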

Name: Anonymous 2016-05-31 18:12

> Higher-order functions will always overtake lower-order ones.

Yes, if the input size is unbounded, which it is always assumed to be within the theoretical field of computational complexity. In reality it isn't, but generally it's still large enough that order makes more of a difference than coefficients (which is why big-O notation drops coefficients entirely). A higher-order function may run faster on small inputs than a lower-order one (again, due to coefficients), but that isn't what computational complexity is about.
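Concretely: an algorithm taking n^2/100 steps (higher order, tiny coefficient) beats one taking 100n steps (lower order, big coefficient) for every n below 10,000, and loses forever after that.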

Name: Anonymous 2016-05-31 19:18

>>23
> In reality it isn't
Sure, but if we're talking about practical applications then ``making the program run slower so it doesn't grow with any reasonable input'' isn't exactly... practical.

Name: Anonymous 2016-05-31 19:51

>>24
Yes, that's what I said earlier: if input is bounded, then it most certainly is possible to make any algorithm run in constant time, simply by making it always take the time it would need for the maximum input. But obviously there's no practical benefit to that. And in real life there is the possibility of a higher-order algorithm with a low coefficient outperforming a lower-order algorithm with a high coefficient on inputs of up to a given size.
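(This is exactly why real sorting libraries hybridize: insertion sort is O(n^2) with a tiny constant, so Timsort and introsort hand small runs, typically a few dozen elements or fewer, to insertion sort and use the O(n log n) algorithm only above that cutoff.)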

Name: Anonymous 2016-05-31 20:16

>>25
OK, then what's the point? It's neither practical nor theoretically interesting.

Name: Anonymous 2016-06-01 7:42

>>18,21
N is the data size, not the input parameter to be worked on. The idiocy comes from the assumption that the ``input'' needs to be fully walked, which is what breaks this argument against O(1). Algorithms don't need to walk the entire data set, unless you're a shit programmer or have a shit understanding of computer science.
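For instance (a throwaway Haskell sketch; Data.Map is from the containers package, Data.Vector from vector, and the key ``widget'' is made up):

import qualified Data.Map as M
import qualified Data.Vector as V

-- O(log n): follows a single path down a balanced tree.
lookupPrice :: M.Map String Int -> Maybe Int
lookupPrice = M.lookup "widget"

-- O(1): jumps straight to the slot (partial: assumes length >= 3).
thirdElem :: V.Vector Int -> Int
thirdElem v = v V.! 2

Neither one walks the whole data set.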

F-

Name: Anonymous 2016-06-01 9:36

>>27
> Algorithms don't need to walk the entire data set, unless you're a shit programmer or have a shit understanding of computer science.
Some algorithms don't, others do. Or maybe you can show me an O(C) sorting algorithm which doesn't walk the entire data set.

Name: Anonymous 2016-06-01 12:14

>>28
I could, but I'm still working on getting it published, so I won't.

Name: Anonymous 2016-06-01 17:08

> Algorithms don't need to walk the entire data set, unless you're a shit programmer,

Depends on the algorithm. An algorithm to print all values greater than 10 in an integer array has to iterate through the whole array. And even algorithms that don't need to iterate through the entire array often have an average running time proportional to the array size.
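In Haskell terms (throwaway sketch):

printBig :: [Int] -> IO ()
printBig = mapM_ print . filter (> 10)

Every element has to be inspected once to decide whether it prints, so nothing can beat O(n) here.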
