
Why browsers are bloated

Name: Anonymous 2014-07-27 0:20

https://github.com/WebKit/webkit/blob/master/Source/WebCore/platform/Scrollbar.cpp
https://github.com/WebKit/webkit/blob/master/Source/WebCore/platform/win/ScrollbarThemeWin.cpp
Let's reinvent the fucking scrollbar, which every goddamn platform with a UI already has, and make it behave subtly different from the native one!

Right-click a native scrollbar in some other app:
- Scroll Here
- Top
- Bottom
- Page Up
- Page Down
- Scroll Up
- Scroll Down

Right-click a scrollbar in Chrome:
- Back
- Forward
- Reload
- Save As...
...

Right-click a scrollbar in Firefox and Opera:
Absolutely fucking nothing happens!

What the fuck!? How did these terminally retarded idiots get involved in creating one of the most important pieces of software to the average user?

Name: Anonymous 2014-11-24 23:33

I always knew type systems would turn out to be snake oil.

Let's go back to the days of BLISS and B, where everything is a word.

Name: Anonymous 2014-11-24 23:56

>>361
Having a new processor where byte = word and word >= 48bit would be nice.

Name: Anonymous 2014-11-25 0:03

>>362
You want 8 to be >= 48?

Name: Anonymous 2014-11-25 0:13

>>363
A byte is not 8 bits on every architecture. You are thinking of an octet.
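A byte's width is whatever the architecture says it is; in C and C++ terms, CHAR_BIT is only required to be at least 8. A minimal sketch that makes the point (on most desktop machines it prints 8, but some word-addressed DSPs report 16):

// CHAR_BIT is the implementation's bits-per-byte; the standard only
// guarantees CHAR_BIT >= 8, not CHAR_BIT == 8.
#include <climits>
#include <cstdio>

int main() {
    std::printf("bits per byte: %d\n", CHAR_BIT);
    std::printf("bits per int:  %zu\n", sizeof(int) * CHAR_BIT);
}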

Name: Anonymous 2014-11-25 0:30

>>354
Fucking annoying stupid undergrad.

Name: Anonymous 2014-11-25 0:35

>>358
What does any of that shit have to do with 1s and 0s?

Name: Anonymous 2014-11-25 0:38

Name: Anonymous 2014-11-25 0:45

>>354
Incredible still means unbelievable. Unbelievable still means unbelievable.

You're confusing the figurative connotation of a word with its direct opposite. Something can be literally unbelievable
(space niggers shitting dicks on the moon)
or figuratively unbelievable
(you're so dumb/hot it's hard to believe, but it's not like I don't trust my personal judgment because I don't mean it in a literal sense).

Something can be literally terrible
(Nikita's mom inspires fear even amongst Putin's cabinet)
or figuratively terrible
(This quality I'm judging right now is so deep it almost scares me, but it's nothing serious so I have nothing to be afraid about)

Are you implying the current use of "literally" is somehow a figurative use of the original word? How can something be figuratively literal anyway?

Name: Anonymous 2014-11-25 0:47

>>367
I remember a thread about him!

Name: Anonymous 2014-11-25 0:50

For you fags who praise the niggers' idiocy and call it ``language evolution'', what word is one supposed to use now if one needs to convey the concept of wanting a phrase not to be interpreted figuratively?
Huh?!

Name: Anonymous 2014-11-25 1:01

>>370
You can still use literally. Just because a word gains a new meaning doesn't mean the old meaning magically disappears, unless the old meaning is replaced by other words ("dumb" and "gay", for example). Given that "literally" is used in academic contexts, that's not likely to happen.

Also, since we're talking about language evolution, I'd like to point out that you used the word "wanting" in the sense of "to desire", when that word originally meant "lacking." If the Shakespearean-age prescriptivists were here today, they'd shun you for using "want" when you meant "will" or "wish."

Name: Anonymous 2014-11-25 1:16

Typical academia nihilists and their worship of the inferior masses (niggers and their skittles) and their non-progress. Nigger value of arbitrary change. Commie retards. muh gender queer power studies fuckin evil prascraptavists muh relativity muh subjectivism oy vey evil theists and their objective values approach! get out the nigger dicks!

Name: Anonymous 2014-11-25 1:18

logic is arbitrary
these computers and their processors are magic and we can do anything with them
typetheorists.jpg

Name: Professor Shlomo Goldberg 2014-11-25 1:19

>>371
prescriptivists
Good goy.

Name: Anonymous 2014-11-25 1:21

steven ``kike'' pinker is a jew
WHO WOULD OF THOUGHT
Sexual selection confirmed to be a jewish invetion.
implying peacocks have anything to do with humans
implying high percent of gene relation with chimps (niggers?) doesn't actually end up proving you wrong and showing that genes have nothing to do with human psychology

Name: Anonymous 2014-11-25 1:24

high level languages are a jewish invention

Name: Anonymous 2014-11-25 1:29

>>376
Eigo is ultimately based on Jewgo

Name: Anonymous 2014-11-25 4:58

>>327
1. Can be fixed with just the evaluations optimized: Flags
2. Only if really needed for the type of OS, like an RTOS or embedded devices. What this really needs is the list of dependencies.
3. Can be removed. Usually not recommended for validations. On a mass distribution, required (MDr). Personal binaries (PB), no.
4. Same as 2. Can be removed. MDr, PB: up to you.
5. Can be removed. A better system uses a version with a hash of the "update". E.g. version control systems, delta journaling filesystems, GPG signatures, etc.
6. Required; without it the binary fails. But with it you can cover many of the above issues.
7. Read 6.
8. Read 2.

Most of these come down to personal use-case scenarios versus mass distribution, where you HAVE TO notate these things. You can, if you want, build your own modified distribution from some base, like the multiple Debian clones do, and tailor it with the above solutions for personal use.

At the end of the day, jailing your applications with access controls is all you really need to do. Binary X does not need to know or touch Y.common.dependencies, which they both share. X only knows the location of a "common" directory where it can find X.dependencies. But binary Y, in its own jail, cannot under any circumstance know that binary X called common.shared.dependencies. Only the shared.common.directory daemon knows X and Y asked for it, and it tells the AC daemon such, logs it and hashes it.
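A bare-bones sketch of the jailing part (POSIX only, needs root; the jail path, uid/gid and binary name are placeholders, and the shared-dependency daemon described above is not modeled here):

// Confine a program to its own directory tree before handing it control.
#include <cstdio>
#include <unistd.h>

int main() {
    if (chroot("/jails/x") != 0 || chdir("/") != 0) {   // restrict filesystem view
        std::perror("chroot");
        return 1;
    }
    if (setgid(1000) != 0 || setuid(1000) != 0) {       // drop root afterwards
        std::perror("drop privileges");
        return 1;
    }
    execl("/bin/x", "x", (char *)nullptr);              // run the jailed binary
    std::perror("execl");
    return 1;
}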

IoW: Stop using defaults, you silly goose. But thank you for the insight. Never thought of this until now.

Name: Anonymous 2014-11-25 5:54

>>368
Yes, something incredible is by extension unbelievable, but nobody says "I don't believe you. You're being incredible." Likewise, nobody says "I'm so scared because that was terrible!" The two words have evolved different meanings. Incredible has come to be synonymous with amazing. Terrible has come to be synonymous with bad. If I translated "nomen Dei terribile est" as "God's name is terrible", I'm sure there would be plenty of Christians offended by that usage.

Furthermore, as I pointed out in the /lounge/ discussion of this same topic, the word literally is itself a figurative usage. The word literally means "pertaining to letters" (from the Latin "litteralis", "of letters", from "littera", "letter"). Its usage outside of letters is by extension, and therefore figurative. I also pointed out that etymology does not dictate meaning, but it is nonetheless an interesting note to add, especially since you suggested that the word literally can't be used figuratively.

Name: Anonymous 2014-11-25 8:30

Terrible! *pisses pants*

Name: Anonymous 2014-11-25 20:53

Did tdavis just join this thread?

Name: Anonymous 2014-11-26 14:53

>>365
Stop being so uneducated and stupid please. Just accept the facts.

Name: Anonymous 2014-11-26 15:51

>>382
Postmodernist.

Name: Anonymous 2014-11-26 18:45

>>379
> etymology does not dictate meaning
Well of course it doesn't. Meaning dictates meaning. When you use words wrong, you're just spouting nonsense.

Name: Anonymous 2014-11-27 4:29

>>384
What was your point in agreeing with me?

Name: Cudder !MhMRSATORI 2014-11-27 15:25

>>327
Try determining those 8 things you listed from this binary (base 64'd):

TVogAAEAAAACAAMAUEUAAEwBAQAAAAAAAAAAAAAAAADgAA8BCwEAAAACAAAAAAAAAAAAADQQAAAA
EAAADAAAAAAAQAAAEAAAAAIAAAQAAAAAAAAABAAAAAAAAACqEAAAAAIAAAAAAAACAAAAAAAQAAAQ
AAAAABAAABAAAAAAAAAQAAAAAAAAAAAAAABgEAAAKAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
ABAAAAgAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAudGV4dAAAAKoAAAAAEAAArAAAAAACAAAA
AAAAAAAAAAAAAAAgAADgAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAACQ
EAAAAAAAAEJ5ZSB3b3JsZCEAAEJ5ZQBIZWxsbyB3b3JsZCEAAAAARmlyc3QgcHJvZwAAVos1ABBA
AGoAaCgQQABoGBBAAGoA/9ZqAGgUEEAAaAgQQABqAP/WXsPMzMyIEAAAAAAAAAAAAACeEAAAABAA
AAAAAAAAAAAAAAAAAAAAAAAAAAAAkBAAAAAAAAC+AU1lc3NhZ2VCb3hBAFVTRVIzMi5kbGwAAAAA

Name: Anonymous 2014-11-27 19:47

>>327,386
Boy, this will be interesting.

Name: Anonymous 2014-11-27 20:22

>>378
> 1. Can be fixed with just the evaluations optimized: Flags
What do you mean? Asking the compiler to optimise the evaluations?

> 5. Can be removed. A better system uses a version with a hash of the "update". E.g. version control systems, delta journaling filesystems, GPG signatures, etc.
Still in binary if you made use of __DATE__ and __TIME__.
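Both macros expand to plain string literals at preprocessing time, so the build date lands verbatim in the binary's data section whether or not it is ever printed; a trivial sketch:

// strings(1) on the resulting binary will show the date and time of compilation.
#include <cstdio>

int main() {
    std::printf("built on %s at %s\n", __DATE__, __TIME__);
}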

> 8. Read 2.
They are in my binary when I compile with default gcc without using an RTOS.

> 4. Same as 2.
You mean same as 3.

Since most people use the default options or some popular options, using most of the flags will make NSA know what binaries belong to a specific author. This is the biggest problem. If NSA sees the "GCC (Debian 4.2)" missing they will know it's YOU and they will be able to find other binaries made by you.

There is a lot of other stuff that may de-anonymise you; for example, if you only use fgetc instead of getc and getchar, they will try to connect it with other binaries that do that.

>>386
How should I know?

Name: Anonymous 2014-11-27 21:39

>>386
If that were true, DRM wouldn't exist. You would be able to create a unique encoding for every media file and put it up for download in a bundle with its unique decoder. Then the pirate and downloader would be able to claim that since it's just a random bytestream (i.e. not a media file in any widely known codec), it's not a media file and doesn't violate any copyrights.

But that doesn't happen. So you're wrong.

Name: Anonymous 2014-11-27 23:11

>>388
He can just put in the most widely used flags regardless of what actual options he used.
/thread

Name: Anonymous 2014-11-27 23:19

>>390
It still has problems.

Name: Anonymous 2014-11-27 23:21

>>388
1. Instead of having the compiler name in the executable file format, just have it compiled so it does not carry any of that info, making sure it is "generically" optimized.
5. Nope, you can remove __DATE__ and __TIME__ entirely, and just output a revision hash.
8.
> They are in my binary when I compile with default gcc without using an RTOS.
Default, keyword there. That's why I said you need to recompile using flags for more anonymity about its source origin. Symbols can technically be done away with on any binary. Most flags and dependency calls can be anonymized too, by just calling the required hash of said symbols and dependencies.

Even then, you are only identifying the binary with it, not the owner of the OS.
4. No, 2. The OS and distribution are always targeted for an architecture, always. If the binary has an opcode from another arch it will fail horribly, even if you don't label the binary at all. You don't even need to mark the OS and distribution; the binary will fail if it's not running on the proper OS with its dependencies. No need to mark the binary.

> Since most people use the default options or some popular options,
We are not most people, now, are we?

> using most of the flags will make NSA know what binaries belong to a specific author.
Actually, if everyone is using the defaults for everything, it will be harder for any intruder to distinguish you from another person who ran with the defaults. NSA included.

> If NSA sees the "GCC (Debian 4.2)" missing they will know it's YOU and they will be able to find other binaries made by you.
And that's the point of spreading a distribution: so that more than one person runs the same/similar system, making it difficult to distinguish you from others following your example. What do you think people sharing the same files has done? Made one user more identifiable than the next by doing the same thing another has done?

> if you only use fgetc instead of getc and getchar, they will try to connect it with other binaries that do that.
That's if you let the system poll the binaries on your system, like an improperly jailed system does (or one not jailed at all). Sure, having a ``different'' scheme of binaries than the norm identifies you, but so do the simple MAC address or serial numbers of your device, which are the real threat to anonymity.

Also, try Cudder's Challenge: >>386
Simple raw binary, I assume assembled, not compiled.
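For what it's worth, here is a sketch of how far the headers alone get you, assuming the blob above has been base64-decoded to a file called prog.exe (the name is arbitrary). It pulls out the COFF fields a bare PE still carries, the target machine and the linker timestamp; compiler name, source language and build options simply aren't recorded in a stripped binary like this one.

#include <cstdint>
#include <cstdio>
#include <fstream>
#include <iterator>
#include <vector>

int main() {
    std::ifstream f("prog.exe", std::ios::binary);
    std::vector<unsigned char> b((std::istreambuf_iterator<char>(f)),
                                 std::istreambuf_iterator<char>());
    if (b.size() < 0x40 || b[0] != 'M' || b[1] != 'Z')
        return std::puts("not an MZ file"), 1;
    // e_lfanew at offset 0x3C points at the PE signature.
    uint32_t pe = b[0x3C] | b[0x3D] << 8 | b[0x3E] << 16 | uint32_t(b[0x3F]) << 24;
    if (pe + 24 > b.size() || b[pe] != 'P' || b[pe + 1] != 'E')
        return std::puts("no PE signature"), 1;
    uint16_t machine = uint16_t(b[pe + 4] | b[pe + 5] << 8);     // 0x014C = i386
    uint32_t stamp = b[pe + 8] | b[pe + 9] << 8 | b[pe + 10] << 16
                   | uint32_t(b[pe + 11]) << 24;                 // link time, often zeroed
    std::printf("machine=0x%04x  TimeDateStamp=%u\n", machine, stamp);
}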

Name: Cudder !MhMRSATORI 2014-11-28 15:37

>>392
You are already wrong.

Didn't realise it was 386GET!

Name: Anonymous 2014-11-28 17:19

>>393
Shalom!

Name: Anonymous 2014-11-29 22:11

Name: Anonymous 2014-11-30 9:01

>>395
LOL
Language                                             Argument type    Return type
C++ (since 1998), Java (since J2SE 5.0), Scala, D    Invariant        Covariant
C#                                                   Invariant        Invariant
Sather                                               Contravariant    Covariant
Eiffel                                               Covariant        Covariant


Not one of the mainstream languages gets it right. Only Sather does, and I hadn't even heard of it prior to reading this article. All corporate languages are shit.
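To make the C++ row concrete (a sketch with made-up class names): an override may narrow the return type, but its parameter types have to match the base exactly.

struct Animal {};
struct Cat : Animal {};

struct Shelter {
    virtual ~Shelter() = default;
    virtual Animal* adopt() { return new Animal; }
    virtual void take(Cat* c) { (void)c; }
};

struct CatShelter : Shelter {
    // OK: Cat* is covariant with Animal*, so this still overrides adopt().
    Cat* adopt() override { return new Cat; }
    // A widened (contravariant) parameter would NOT override:
    // void take(Animal*) override {}   // error: does not override anything
    void take(Cat* c) override { (void)c; }   // parameter types are invariant
};

int main() {
    CatShelter s;
    delete s.adopt();   // statically a Cat*, usable wherever an Animal* is expected
}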

Name: Cudder !MhMRSATORI 2014-11-30 16:29

>>395,396
WTF is this OOP bullshit?

Name: Anonymous 2014-11-30 16:59

>>397
Jewish shit. Just like you!

Name: Anonymous 2014-11-30 21:11

KHTML is the promised land.

Name: Anonymous 2014-11-30 21:30

>>399
KHTML is cut from the same exact cloth as all the other garbage. It goes all the way back to the overwrought compiler architectures used to build them. People these days don't understand the conceptual differences between a preprocessor, a compiler, a linker, and an assembler. These are separate phases and they need to be written as separate programs, and the language designer needs to refrain from doing the work of one phase while performing another.
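For reference, that split is still visible in how the GNU driver can be told to stop after each phase; a sketch (file names are arbitrary, and clang++ accepts the same flags):

// hello.cpp -- one translation unit walked through the four classic phases:
//   g++ -E hello.cpp -o hello.ii   # preprocess only (macros and #includes expanded)
//   g++ -S hello.ii  -o hello.s    # compile only (C++ -> assembly)
//   g++ -c hello.s   -o hello.o    # assemble only (assembly -> object file)
//   g++ hello.o      -o hello      # link (object files -> executable)
#include <cstdio>

int main() {
    std::printf("hello\n");
}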
