Google's Eric Schmidt on What the Web Will Look Like in 5 Years (readwriteweb.com)
52 points by nreece on Oct 28, 2009 | 39 comments


It would be nice to know what he means by 'Chinese content will dominate the web': does he mean that in terms of importance or in terms of volume (or both)?


If Chinese starts to dominate, I'd expect to see great strides in automated translation research here in America. We Americans are freaked out by the prospect of a world that doesn't speak English.


Freaked out?

Perhaps the translation services will arise out of our desire to communicate and converse with more people, not out of fear. It's an opportunity, not a threat.

As a side note, the desire to think that others act from fear really oversimplifies a lot of social behavior.


Good point, I don't know either what he meant. I think importance would be difficult to measure, but volume growth would be easy for Google to track. I imagine that Schmidt is not talking blindly; he has access to content growth rates, and the forecast may be accurate.


Perhaps a lot of Chinese business leaders were at that conference, so he was just tailoring his remarks. Without context, you can't really extract meaning.


This statement makes no sense to me either.

I'm not going to learn Chinese; neither are you, for most values of "you". So we'd still have our web for a small group of maybe one billion English-speaking (FSVO) users. Maybe there would be more content in Chinese, but we will never know, because we won't bother to read it unless they translate it, to ENGLISH.


If Schmidt is expecting machine translation to significantly improve, it makes perfect sense.

You wouldn't need to learn Chinese for it to impact you. You'd just start getting more and more readably-Chinglish results as you search. And if translation services work well enough each way, it would have an unavoidable impact on products/services angling toward Chinese interests.


Very interesting. The really smart part was the bit about ranking real-time content.

If anyone here is working on a real-time tweet-ranking algorithm, good luck. Google's going to be there too.

It would be very useful if search.twitter.com had a mode which returns tweets ranked not just by posting time, but also by the quality of the tweeter and the quality of the specific tweet (maybe gathered from RTs, etc.).
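One way to sketch the scoring the comment describes: combine an exponential recency decay with log-scaled proxies for tweeter quality (followers) and tweet quality (retweets). The field names, weights, and two-hour half-life here are all hypothetical, not anything Twitter or Google actually uses:

```python
import math

def tweet_score(tweet, now):
    """Blend recency with author and tweet quality (all fields hypothetical)."""
    age_hours = (now - tweet["posted_at"]) / 3600.0
    recency = math.exp(-age_hours / 2.0)           # exponential decay with age
    author = math.log1p(tweet["followers"])        # crude proxy for tweeter quality
    engagement = math.log1p(tweet["retweets"])     # crude proxy for tweet quality
    return recency * (1.0 + author + engagement)

now = 1_000_000.0
tweets = [
    {"posted_at": now - 600,  "followers": 50,    "retweets": 0},    # fresh, low-signal
    {"posted_at": now - 7200, "followers": 10000, "retweets": 300},  # older, high-signal
]
ranked = sorted(tweets, key=lambda t: tweet_score(t, now), reverse=True)
```

With these made-up numbers the two-hour-old tweet from a widely-followed, heavily-retweeted account outranks the ten-minute-old one; tune the decay constant to shift the balance back toward freshness.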


All the top websites today were just getting their start five years ago. Therefore, it follows that all of the sites that will dominate five years from now are probably already around in their infancy today. So what are they?

I'm guessing Twitter, Scribd, FourSquare, Justin.tv, and Hulu. I feel like we'll have a new social news site that takes out both Digg and Reddit, but I don't yet know what that is.


Tom Foremski had an interesting perspective "GOOG CEO Predicts A Predictable Future Web - Stunning Absence Of Any Real Insights" http://www.siliconvalleywatcher.com/mt/archives/2009/10/goog...


I concur with this editorial. I could have come up with the statements that Mr. Schmidt listed here.


Anyone else think it's weird that he wouldn't even mention Twitter and Facebook? He just called them "the two fine companies". I know they want to own the information and distribute it (that's their business). But why not talk about them directly?


The idea that ranking real-time content is "the great challenge of the age" strikes me as hyperbolic.

Real-time content is great for some key things -- breaking news, planning a trip or outing, attention-addled entertainment -- but a distraction for most other productive activities. It deserves an arena -- but doesn't need to be mixed with my searches for reference information.

So the "great challenge" might be answered rather simply by: give real-time its own silo. Filter non-credible sources. Suppress duplicates. Forward 1-2 results to general search for applicable queries.
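The comment's proposal (silo, filter, dedupe, forward a couple of results) is concrete enough to sketch. This is a toy illustration of that pipeline, with all field names invented for the example:

```python
def realtime_results(items, credible_sources, max_forward=2):
    """Sketch of the proposal: silo real-time items, filter, dedupe, forward a few."""
    seen = set()
    silo = []
    for item in items:
        if item["source"] not in credible_sources:
            continue                        # filter non-credible sources
        key = item["text"].strip().lower()
        if key in seen:
            continue                        # suppress duplicates
        seen.add(key)
        silo.append(item)
    # real-time keeps its own silo; only 1-2 results leak into general search
    return silo, silo[:max_forward]

items = [
    {"source": "cnn",      "text": "Quake hits"},
    {"source": "spamblog", "text": "buy pills"},
    {"source": "bbc",      "text": "quake hits "},   # duplicate after normalization
    {"source": "cnn",      "text": "Power out downtown"},
]
silo, forwarded = realtime_results(items, {"cnn", "bbc"})
```

The hard part, of course, is the two lines hand-waved here: deciding which sources are credible and which texts are "the same" at web scale.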


I think Chinese dominating the web is going to happen, but it won't disrupt non-Chinese-speaking content consumption and associated services. Japan is wired, almost all 8-50 year olds use technology actively, and they infuse a large amount of content (mixi.jp, anyone?) into the web, dominating Japanese content streams, search, and web services. However, the non-Japanese-speaking/reading world only sees the occasional odd image, news item, etc. Silos are a nice analogy.


"Five years is a factor of ten in Moore's Law, meaning that computers will be capable of far more by that time than they are today."

Remember PCs five years ago?

Exactly: not much changed except for multicore, and it's still underused. I wouldn't expect an average PC in five years to be anything unusual.

Mobile devices, on the other hand, have a lot of room to grow. Maybe we'll see phones with HD screens and 1GB of RAM in five years, and those would run whatever you'd wish.
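The "factor of ten in five years" quoted above is just compound doubling. A quick sanity check of the arithmetic, assuming the classic 18-month doubling period:

```python
# Moore's Law as commonly stated: capacity doubles every 18 months.
months = 5 * 12              # five years
doublings = months / 18      # ~3.33 doublings
factor = 2 ** doublings      # ~10.1x
print(round(factor, 1))
```

So five years at an 18-month doubling rate works out to almost exactly 10x, which is presumably where Schmidt's figure comes from, even if (as the thread notes) that growth now shows up as more cores rather than higher clock speeds.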


Computers are more than just their CPU's GHz count.

A top PC from 2004 might look like: Intel Pentium 4 3.0C @ 3.24 GHz (Northwood core), 512MB PC3200 DDR SDRAM (400 MHz), 160 GB ATA-100 hard drive, GeForce 6800, 17" CRT monitor.

Now we would expect 8 times as much RAM and disk space; far faster CPUs with four cores and a huge L2 cache; and 4-8x as much RAM on a graphics card with 2700 GFLOPS vs 54 GFLOPS of processing power. Our monitors are larger, lighter, and higher-resolution. But what's most interesting is that we are far less CPU-bound in 2009 than we were in 2004.

PS: A single i7 core would crush a Northwood Pentium 4 clock for clock, even limiting itself to the same instruction set. A 3.4 GHz i7 has more floating-point performance than a 2004-era graphics card (69 vs 54 GFLOPS).

Note: I am ignoring the Prescott, as it took a Prescott core overclocked to 5.2 GHz to soundly beat a 64-bit Athlon FX-55 clocked at 2.6 GHz. http://en.wikipedia.org/wiki/Pentium_4


The interesting thing to notice here is that the biggest impact of all that hardware improvement, when you look at it in a web context, actually goes toward increased JavaScript execution speed.

Most of the other advantages only come out when playing computer games or something like that.


I disagree. The biggest change over the last 5 years has to be that we're designing web content for mobile devices in a very serious way.

And that is due to many 'Moore's Law' advances that have relatively little to do with JavaScript execution (small-screen quality, CPU/GPU/memory speed, improved communications chips and infrastructure, etc.).


In 2004 I was using a 2048x1280 display at 112 ppi. Today I can't find a screen over 100 ppi. So in pixel density we have actually gone backwards, except for laptops.


High ppi IS useful: you don't need antialiasing software and such if you have high ppi. Personally, I could find a use for a WQXGA (2560x1600) screen in a 14" or 15" laptop. And no, font size does not become tiny unless you want it to (although I want tiny fonts), with new desktop technologies like Microsoft Aero or Linux counterparts like Plasma and Compiz.

http://en.wikipedia.org/wiki/WUXGA

Why we are going backwards: http://www.nytimes.com/2008/11/13/technology/13iht-13panel.1...


While pixel density obviously isn't making leaps and bounds, Apple's products are pretty consistently over 100 ppi (not to mention the iPhone at 160 ppi):

27-inch new iMac display (2560x1440) - 109 ppi

21.5-inch new iMac display (1920x1080) - 102 ppi

15-inch MacBook Pro (1440x900) - 113 ppi

30-inch Cinema Display (2560x1600) - 101 ppi
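The figures above follow directly from the pixel dimensions and the diagonal size: ppi is the pixel-diagonal length divided by the physical diagonal in inches. A quick check of the list (the "15-inch" MacBook Pro is treated as exactly 15" here, as in the comment):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: pixel diagonal over physical diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

displays = [
    ("27-inch iMac",           2560, 1440, 27),
    ("21.5-inch iMac",         1920, 1080, 21.5),
    ("15-inch MacBook Pro",    1440, 900,  15),
    ("30-inch Cinema Display", 2560, 1600, 30),
]
for name, w, h, d in displays:
    print(name, round(ppi(w, h, d)), "ppi")
```

This reproduces the quoted 109 / 102 / 113 / 101 ppi values.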


My two-year-old 15-inch ThinkPad T61p is 1920 x 1200, which is about 150 ppi. And honestly, I don't really want a higher ppi: software that does not let you change the default font size becomes almost unreadable. It all comes down to viewing distance, and the further you sit from the screen, the less useful high ppi becomes.


For me, the same observation holds true for bandwidth. I have had the same bandwidth for a couple of years and see just no reason to upgrade. A 100 Mbit line (assumed, though not explicitly mentioned in the article) seems out of reach for now; possibly an upgrade to 10 Mbit, but even that would surprise me very much...


There are already phones out and planned with 800x480 and higher-resolution screens, 1 GHz processors, and upwards of 256MB of RAM.

Moving up to 720p HD, 1GB of memory, and even quicker/more-core chips doesn't seem more than a couple of years off. Whether they'll manage to make them any more usable is a different question :)


But the real leverage is in the servers or the grid. Ok, the cloud.


1. no netbooks

2. no iPhones


iPhones and other smartphones. I just bought a 16GB USB stick for the equivalent of $95 (and I live in an expensive country). What was the cost of that kind of (portable) storage space in 2004?


Google powered brain augmentations streaming text-based ads directly to your nightmares?

Sorry had to be said :)

Moore's Law still holds, except we realized that doing what we classically did was not working out. Making CPUs faster causes heat problems, and those are getting very problematic (remember when computers only needed a heat sink?). So instead of building a faster car, we just build more lanes on the road, so everyone can move quickly. Remember, a second core almost doubles the capability; we just need software to catch up :P


Shouldn't the title of their article more accurately be:

  What Google will MAKE the Web Look Like in 5 Years
(seriously, who's going to change any path they decide to take?!)


Google is not beyond reproach. If a Google service drops the ball and people have an alternative, they'll move on.


They're already dropping the ball, especially when it comes to search quality. There has been a marked decrease in the quality of results in the last couple of months.


People have been saying that exact thing for years. Didn't seem to have much impact..


I think the jury's out on Google Wave.


Maybe not in 5, but in 15 years, there may very well be no Web.


It's interesting that you got modded down that ruthlessly. For one, I'm quite interested in what you have to say; secondly, it is not unthinkable at all. It didn't take long for the web to replace the majority of the information retrieval systems we had up to then, and that was roughly 15 years ago.

Now of course we do have the web, so the bar is a lot higher but I don't think it is an impossibility that a successor will arise.

Hopefully that one will be able to get past some of the mistakes of the current one. Add a viable built-in micropayment system, and you just might have every newspaper on the planet pushing your new service.


Why do you think so? What do you suppose will replace it?


I could tell you, but then I'd have to kill you... ;)


Well, if it's google I think the answer will likely be 'inaccessible'.


Has Google been having massive server downtime issues that I'm not aware of?



