Bad Google, bad!

Place to talk about all that new hardware and decaying software you have.

Moderator: General Mods

Nach
ZSNES Developer
Posts: 3904
Joined: Tue Jul 27, 2004 10:54 pm
Location: Solar powered park bench
Contact:

Post by Nach »

blackmyst wrote:
Nach wrote:The Google homepage purposely doesn't validate.

One of their goals was to sacrifice standards wherever needed, as long as the page runs and displays properly on every browser in use. They did this because they want as little markup as possible, so it loads instantly.

Really? I wish every website loaded instantly like that.
Read: http://blog.outer-court.com/archive/2005-11-17-n52.html

I quote:
Q: In more general terms, what do you think is the relationship between Google and the W3C? Do you think it would be important for Google to e.g. be concerned about valid HTML?

A: I like the W3C a lot; if they didn’t exist, someone would have to invent them. :) People sometimes ask whether Google should boost (or penalize) for valid (or invalid) HTML. There are plenty of clean, perfectly validating sites, but also lots of good information on sloppy, hand-coded pages that don’t validate. Google’s home page doesn’t validate and that’s mostly by design to save precious bytes. Will the world end because Google doesn’t put quotes around color attributes? No, and it makes the page load faster. :) Eric Brewer wrote a page while at Inktomi that claimed 40% of HTML pages had syntax errors. We can’t throw out 40% of the web on the principle that sites should validate; we have to take the web as it is and try to make it useful to searchers, so Google’s index parsing is pretty forgiving.
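To put rough numbers on those "precious bytes" (my own made-up markup below, not Google's actual page), here is roughly what quoted attributes and optional closing tags cost:
[code]# Illustrative only: hypothetical markup, not Google's real home page.
minimized  = '<body bgcolor=white><p>Results<p>More'
validating = '<body bgcolor="white"><p>Results</p><p>More</p></body>'

# About 37 vs. 54 bytes for this fragment; multiply by every tag on the page.
print(len(minimized), len(validating))[/code]
Only a handful of bytes per tag, which is why worrying about it really only makes sense at Google's kind of scale.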
May 9 2007 - NSRT 3.4, now with lots of hashing and even more accurate information! Go download it.
_____________
Insane Coding
Noxious Ninja
Dark Wind
Posts: 1271
Joined: Thu Jul 29, 2004 8:58 pm
Location: Texas
Contact:

Post by Noxious Ninja »

Only 40% of pages have errors? I think that's rather optimistic.
[u][url=http://bash.org/?577451]#577451[/url][/u]
Nach
ZSNES Developer
Posts: 3904
Joined: Tue Jul 27, 2004 10:54 pm
Location: Solar powered park bench
Contact:

Post by Nach »

Noxious Ninja wrote:Only 40% of pages have errors? I think that's rather optimistic.
No, only 40% have syntax errors; I'm sure far more than that have errors of all kinds.
May 9 2007 - NSRT 3.4, now with lots of hashing and even more accurate information! Go download it.
_____________
Insane Coding
Nightcrawler
Romhacking God
Posts: 922
Joined: Wed Jul 28, 2004 11:27 pm
Contact:

Post by Nightcrawler »

I really don't think it takes much time to make your web code standards compliant. This big tirade about wasting time on something nobody notices is unfounded.

On all the sites I have developed, I have rarely had to spend any extra time on standards compliance. A competent web developer will inherently do it while programming.

It's not like XHTML is such a demanding standard to follow anyway. Close the tags you open in order, use quotes, etc. Big deal. Why aren't you aiming to do that anyway? How the hell can you rely on the browser to close all your tags where you meant to close them? Not all browsers correct bad code the same way.

And you know standards-compliant code is going to work on all standards-compliant browsers. If it doesn't today, it will when they fix whatever the issue is. Bad code will be bad code forever, with no guarantee of how long it will keep working.

About the only real headache when it comes to standards compliance is moving presentation out of the markup and into CSS. Getting CSS to work on all major browsers can be a chore that isn't always worth it, but that's about it.

For anything else, I just don't see much excuse.
[url=http://transcorp.romhacking.net]TransCorp[/url] - Home of the Dual Orb 2, Cho Mahou Tairyku Wozz, and Emerald Dragon SFC/SNES translations.
[url=http://www.romhacking.net]ROMhacking.net[/url] - The central hub of the ROM hacking community.
sweener2001
Inmate
Posts: 1751
Joined: Mon Dec 06, 2004 7:47 am
Location: WA

Post by sweener2001 »

insta-loading
[img]http://i26.photobucket.com/albums/c128/sweener2001/StewieSIGPIC.png[/img]
Starman Ghost
Trooper
Posts: 535
Joined: Wed Jul 28, 2004 3:26 am

Post by Starman Ghost »

sweener2001 wrote:insta-loading
I don't see how an extra 1 KB worth of closing tags will make a page load any slower, unless you're using a 1200 baud modem from the stone age.
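For the record (my own back-of-the-envelope math): 1 KB is 8,192 bits, which works out to roughly 7 seconds at 1200 baud, about 0.15 seconds on a 56k line, and effectively nothing on broadband.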
[code]<Guo_Si> Hey, you know what sucks?
<TheXPhial> vaccuums
<Guo_Si> Hey, you know what sucks in a metaphorical sense?
<TheXPhial> black holes
<Guo_Si> Hey, you know what just isn't cool?
<TheXPhial> lava?[/code]
creaothceann
Seen it all
Posts: 2302
Joined: Mon Jan 03, 2005 5:04 pm
Location: Germany
Contact:

Post by creaothceann »

Yeah. I don't buy that argument - it's just an easy way for Google to reduce traffic.
vSNES | Delphi 10 BPLs
bsnes launcher with recent files list
Deathlike2
ZSNES Developer
Posts: 6747
Joined: Tue Dec 28, 2004 6:47 am

Post by Deathlike2 »

creaothceann wrote:Yeah. I don't buy that argument - it's just an easy way for Google to reduce traffic.
Bandwidth, people, bandwidth! All those shaved bytes add up to cost savings! :wink:
Continuing [url=http://slickproductions.org/forum/index.php?board=13.0]FF4[/url] Research...
sweener2001
Inmate
Posts: 1751
Joined: Mon Dec 06, 2004 7:47 am
Location: WA

Post by sweener2001 »

sweener2001 wrote:insta-loading
[img]http://i26.photobucket.com/albums/c128/sweener2001/StewieSIGPIC.png[/img]
Noxious Ninja
Dark Wind
Posts: 1271
Joined: Thu Jul 29, 2004 8:58 pm
Location: Texas
Contact:

Post by Noxious Ninja »

Deathlike2 wrote:
creaothceann wrote:Yeah. I don't buy that argument - it's just an easy way for Google to reduce traffic.
Bandwidth, people, bandwidth! All those shaved bytes add up to cost savings! :wink:
Well, in Google's case, I suppose saving 1 KB per hit might be a valid excuse.
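Rough math, assuming (my guess, not a published figure) a few hundred million searches a day: 1 KB per results page is already a few hundred gigabytes of extra traffic every single day.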
[u][url=http://bash.org/?577451]#577451[/url][/u]
Nightcrawler
Romhacking God
Posts: 922
Joined: Wed Jul 28, 2004 11:27 pm
Contact:

Post by Nightcrawler »

Noxious Ninja wrote:
Deathlike2 wrote:
creaothceann wrote:Yeah. I don't buy that argument - it's just an easy way for Google to reduce traffic.
Bandwidth, people, bandwidth! All those shaved bytes add up to cost savings! :wink:
Well, in Google's case, I suppose saving 1KB per hit might be a valid excuse.
Maybe, but browsers support this thing known as gzipped content. So... unless you REALLY cut out a lot, you won't end up saving much size at all. Web page source compresses pretty well.
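A quick sketch of what I mean (made-up markup and numbers, just to illustrate):
[code]import gzip

# Hypothetical page: 200 identical result rows, once without the quotes and
# closing tags that validation wants, once with them. Made-up markup, not
# Google's real page.
sloppy_row = b'<tr><td bgcolor=white>result<td>link'
valid_row  = b'<tr><td bgcolor="white">result</td><td>link</td></tr>'

sloppy_page = b'<html><body><table>' + sloppy_row * 200 + b'</table></body></html>'
valid_page  = b'<html><body><table>' + valid_row * 200 + b'</table></body></html>'

for name, page in (('sloppy', sloppy_page), ('valid', valid_page)):
    print(name, len(page), 'bytes raw ->', len(gzip.compress(page)), 'bytes gzipped')[/code]
The raw pages differ by a few kilobytes, but gzipped they both shrink to a few hundred bytes at most, with very little between them.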
[url=http://transcorp.romhacking.net]TransCorp[/url] - Home of the Dual Orb 2, Cho Mahou Tairyku Wozz, and Emerald Dragon SFC/SNES translations.
[url=http://www.romhacking.net]ROMhacking.net[/url] - The central hub of the ROM hacking community.
creaothceann
Seen it all
Posts: 2302
Joined: Mon Jan 03, 2005 5:04 pm
Location: Germany
Contact:

Post by creaothceann »

vSNES | Delphi 10 BPLs
bsnes launcher with recent files list
Deathlike2
ZSNES Developer
Posts: 6747
Joined: Tue Dec 28, 2004 6:47 am

Post by Deathlike2 »

This is surprising? Seriously, if any sort of standards were ENFORCED at the browser level, we wouldn't be in this mess. Then again, JavaScript is a whipping post for poorly executed ideas.
Continuing [url=http://slickproductions.org/forum/index.php?board=13.0]FF4[/url] Research...
Post Reply