
Rate My Awesome Cryptographic Hash Algorithm

Name: Anonymous 2014-05-25 20:36


unsigned long hash(unsigned char *str)
{
    unsigned int hash = 0;
    int c;

    while ((c = *str++))
    {
        hash += c;
    }

    return hash;
}


Spent hours designing it. It is safer than md5!

Name: Anonymous 2014-05-25 20:45

i'll make sure all my files add up to the same value from now on

Name: Anonymous 2014-05-25 20:46

Spent hours designing it. It is safer than md5!

heh

epic

Name: Anonymous 2014-05-25 20:55

she EPIN /prog/!

Name: Anonymous 2014-05-25 21:01

>>3
heh
heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh 
heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh heh

Name: Anonymous 2014-05-25 21:10

There is one and only one way to produce a good hashing function: use a table, sampling bits from all over the hashed buffer into hash-size-bit values, then XORing them. Make sure the mapping table is really random and has no structure in it. crc32 is a simplified version of this algorithm.
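
For what it's worth, that recipe is roughly tabulation hashing. A minimal sketch, with rand() under a fixed seed standing in for a genuinely random table (an assumption made for reproducibility, not a recommendation), and a hypothetical 8-lane layout:

```c
#include <stdint.h>
#include <stddef.h>
#include <stdlib.h>

#define LANES 8 /* byte positions cycle through 8 independent tables */

static uint32_t tab[LANES][256];

/* Fill the mapping tables. A real implementation would draw these
   once from a proper entropy source; rand() with a fixed seed is
   only here to keep the sketch deterministic. */
static void tab_init(void)
{
    srand(12345);
    for (int i = 0; i < LANES; i++)
        for (int j = 0; j < 256; j++)
            tab[i][j] = ((uint32_t)rand() << 16) ^ (uint32_t)rand();
}

/* Sample every byte of the buffer through its lane's table, XOR it all. */
static uint32_t tab_hash(const unsigned char *buf, size_t len)
{
    uint32_t h = 0;
    for (size_t i = 0; i < len; i++)
        h ^= tab[i % LANES][buf[i]];
    return h;
}
```

Because the lanes repeat every 8 bytes, swapping two bytes 8 positions apart still collides; real tabulation hashing uses one table per position of a fixed-size key.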

Name: Anonymous 2014-05-26 1:54

how about this one?


unsigned long hash(unsigned char *str)
{
    unsigned int hash = 0;
    int i;

    srand(time(NULL));

    for (i = 0; i < 100; i++) {
        hash += rand();
    }

    return hash;
}

Name: Anonymous 2014-05-26 2:40

>>7
int rand(void){return 0;}

Name: Anonymous 2014-05-26 2:49

>>7

hash := method(str,
str asList map(x, Random setSeed(x byteAt(0)) round) \
reduce(x,y, x ^ y))

Name: Anonymous 2014-05-27 10:28

Motifs are good..


function [layer1, layer2] = sponge5(input, layer1, layer2, jarX, jarY, jarZ)
    layer1 = round4(input, layer1, jarX, jarY, jarZ);
    layer2 = round4(input, layer2, jarX, jarY, jarZ);
    layerMid = round4(layer1, layer2, jarX, jarY, jarZ);
    layer1 = round4(layerMid, layer1, jarX, jarY, jarZ);
    layer2 = round4(layerMid, layer2, jarX, jarY, jarZ);
endfunction

function blockmid = round4(blockA, blockB, jarX, jarY, jarZ)
    blockmid = compress(blockA, blockB, jarX, jarY, jarZ);
    blockB = compress(blockB, blockmid, jarZ, jarX, jarY);
    blockA = compress(blockA, blockmid, jarY, jarZ, jarX);
    blockmid = compress(blockA, blockB, jarX, jarY, jarZ);
endfunction

function x = compress(blockA, blockB, jarX, jarY, jarZ)
    bA2 = blockA(jarX);
    bB2 = blockB(jarY);
    bA3 = bA2(jarY);
    bB3 = bB2(jarX);
    v = cellOp(blockB, bA2, jarX, jarY, jarZ);
    w = cellOp(bB2, bA3, jarX, jarY, jarZ);
    v = cellOp(blockA, v, jarX, jarY, jarZ);
    w = cellOp(bB3, w, jarX, jarY, jarZ);
    x = cellOp(v, w, jarX, jarY, jarZ);
    x = mixblock(x, jarX);
endfunction

Name: Anonymous 2014-05-27 10:30

>>10
GIVE UP, LUKE. YOU CAN'T PROGRAM.

Name: Anonymous 2014-05-27 20:41

Thinking about hash algorithms, it is astounding how everything is related: we use mostly the same techniques to compute hashes and to reduce dimensionality in AI algorithms. The mammalian brain reduces information similarly, because a dendritic tree is basically a hash function; so, in a nutshell, the brain is mostly a big hash table.

Name: Anonymous 2014-05-27 20:49

>>12
Dendritic trees are a lot more than a hash function, and they aren't even responsible for the significant hashing-like behaviour in the brain.

Have you been polluting your mind with OCW or something?

Name: Anonymous 2014-05-27 20:50

I like how you guys just seem to know about everything.

Name: Anonymous 2014-05-27 21:15

>>13
Dendritic trees are a lot more than a hash function
How exactly? Of course the learning algorithm is unknown, but its result is just a hash function, whose key feature is mapping high-dimensional data (a dendritic tree samples up to 100,000 bits) to 1 bit. In effect the brain can quickly determine correlations between up to 100,000 different entities (i.e. it can compare up to 100,000 human faces at once).
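
To put the "many bits in, 1 bit out" picture concretely, here's a toy model: a fixed bitmask stands in for whatever the tree "learned" to sample, and a threshold on the active sampled inputs yields the single output bit. Every name here is hypothetical and this is an illustration of the claim, not neuroscience:

```c
#include <stdint.h>
#include <stddef.h>

#define NBITS 1024 /* dimensionality of the input bit vector */

/* Map a high-dimensional bit vector to one bit: AND the input against
   a fixed "learned" mask, count the active sampled bits, threshold. */
static int neuron_fire(const uint8_t input[NBITS / 8],
                       const uint8_t mask[NBITS / 8], int threshold)
{
    int active = 0;
    for (size_t i = 0; i < NBITS / 8; i++) {
        uint8_t sampled = input[i] & mask[i]; /* inputs this tree samples */
        while (sampled) {                     /* popcount the byte */
            active += sampled & 1;
            sampled >>= 1;
        }
    }
    return active >= threshold; /* the 1-bit output */
}
```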

Name: Anonymous 2014-05-27 21:17

>>14

I know about everything.

Name: Anonymous 2014-05-27 22:29

>>15
So many tacit assumptions.

Dendrites do reduce information moment-to-moment for a particular neuron. They do not reduce it to a single bit. They also do not just reduce information for the immediate purposes of causing an action potential. Action-potentials are not just 1-bit signals. They're stateful and integrate information differently over time.

Dendrites also serve the important role of signalling any neurons they've received input from, under the right circumstances. Without this, learning wouldn't work at all, because all your neurons would die; and they have to die when starved of activity, so you can stop being a hallucinating newborn and eventually prune your brain down to something that can make sense of the world without crosstalk from every uncorrelated stimulus getting in the way.

The properties that go into a being capable of learning are individually well understood down to the level of ions. Most of what is not understood is how this all works in combination. There are some biological questions, but for a tiny number of neurons of given types, we can predict how they will learn from given stimuli.

There is no algorithm. That's TED-talk level quackery right there. You want to do AI? Stay away from MIT and stay away from computers.

Name: Anonymous 2014-05-27 22:35

>>15
As if you can generalize the capability of human brains. Superior brains can compare an immeasurable amount at once and then intuitively comprehend all human faces while analysing and synthesizing them. That's what superior dendritic trees can do. Dendritic trees don't possess algorithms, computers do. Dendritic trees are dendritic trees, not a computer. You cant equate brains with computer science as brains aren't computers and computation is just a sub-sub-sub-sub-aspect of the physical sub-sub-aspect of reality.

What I say is facts and what you say is nothing more than citation needed. Look at this, where is the paper for your 100k human face assertion? Nowhere to be found. Hashes are just hashes and neurons are neurons. You cant find the mind in either. http://www.google.com/search?channel=fs&q=Dendritic+trees+100%2C000+human+faces&ie=utf-8&oe=utf-8

Name: Anonymous 2014-05-28 0:36

Name: Anonymous 2014-05-28 0:49

>>1
You invoke undefined behaviour when str ends up pointing one beyond the 0 char. Consider:

unsigned char *foo = malloc(1);
if (foo) {
    *foo = 0;
    hash(foo); // bam trap representation, UB invoked inside hash()
    free(foo);
}


If you were a C masta you'd also know of the subtlety that this isn't UB when you use hash() with arrays. You're allowed to point one past the last element of an array.

Name: Anonymous 2014-05-28 0:57

>>20
false, you know shit about C
what >>1 did is like

while (*str)
{
    hash += *str;
    str++;
}

Name: Anonymous 2014-05-28 1:14

>>21
Incorrect, doofus!
Try again, stupid cunt.

Name: Anonymous 2014-05-28 1:30

>>17

Action-potentials are not just 1-bit signals. They're stateful and integrate information differently over time.
Had they changed their working with time, you would have changed your view of the world. For example, the word "nigger" could have suddenly changed its meaning to a white man. So, no - they are pretty much hardcoded after learning is complete. You will never see a nigger as a white man.

Dendrites also serve the important role of signalling any neurons they've received input from, under the right circumstances. Without this, learning wouldn't work at all.
But we are not talking about learning, which is the hardest part.

There is no algorithm. That's TED-talk level quackery right there. You want to do AI? Stay away from MIT and stay away from computers.
Without a general algorithm, you would not be able to remember sufficiently general structures (i.e. correlate anything to anything, with self-reference).

Name: Anonymous 2014-05-28 2:23

>>23
Fuck off with your baseless assertions, minsky.

Name: Anonymous 2014-05-28 2:24

>>18

The average adult vocabulary is about 30,000 words: i.e. when you read a book, you can recognize and analyze relations between roughly 30,000 concepts. So I would be surprised if the human brain can operate on datasets larger than 100,000 items, especially because no single neuron can sample them all, and as I understand it most neurons sample only a specific portion of the brain: e.g. visual cortex neurons don't directly sample auditory cortex ones.

Name: Anonymous 2014-05-28 2:25

>>24

go suck a nigger cock, integrating him for a woman pussy.

Name: Anonymous 2014-05-28 2:53

>>23
Views of the world change. Niggers cant be seen as white men because they are niggers and not white men and vice versa. Just as an oak tree is not a light bulb and a light bulb is not an oak tree.

Exactly what do the remembrance of general structures of things, the correlating of things, and self-reference have to do with algorithms and dendrites?

>>25
Average adult human vocabulary is about 30000 words
Yes and? I'm sure they can improve their vocabulary to more, some may know words of other languages, some may be below average or incredibly above average.

i.e. when you read a book, you can recognize and analyze relations between 30000 concepts.
Does not follow. Words are used to form concepts but concepts are not words and not all words will equate to a concept. A person who has a vocabulary of 30000 words will recognize 30000 words without needing a dictionary for them.

So I will be surprised if human brain can operate datasets larger than 100000 items, especially because no single neuron can sample them all and as I understand it, most neurons sample only specific portion of the brain: i.e. visual cortex neuron don't directly sample auditory cortex ones.
Does not follow. And what do datasets http://whatis.techtarget.com/definition/data-set have to do with the human brain?

Name: Anonymous 2014-05-28 3:23

>>27

Views of the world change. Niggers cant be seen as white men because they are niggers and not white men and vice versa. Just as an oak tree is not a light bulb and a light bulb is not an oak tree.
Exactly! I love you becoming a racist like the rest of us non-retarded people.

Yes and? I'm sure they can improve their vocabulary to more, some may know words of other languages, some may be below average or incredibly above average.
A word in foreign language would still be related to what you already know. A concept is somewhat more than a word (some auditory or visual representation). Consider monads from Haskell - you know the word, but you don't know a shit about them.

Does not follow and what do datasets http://whatis.techtarget.com/definition/data-set have anything to do with the human brain?
It has to do with the number of relations you can have between objects at once. I.e. if a dendritic tree references neurons dealing with dogs, cats and birds, then its neuron correlates all these animals. In the real world you don't have just the word "cat", because cats have a form, a texture, associated sounds, smells and behavior. So a neuron dealing with a cat would have to sample all the neurons dealing with the form, texture and smell of a cat. That is why understanding cats is not just learning the word "cat". Of course all people are different, and some kid may draw a cat with wrong anatomy or texture, although most people would feel when the anatomy is wrong, even if they can't explain what exactly.

Name: Anonymous 2014-05-28 3:31

>>28
racist
non-retarded
Stupid russian.

Name: Anonymous 2014-05-28 3:53

>>29

Shalom, Hymie!

Name: Anonymous 2014-05-28 11:50

>>22
your mom you fucking shit
tell me how I am wrong
