Recently, GPUs (graphics processing units) have garnered much media attention for their newly tapped ability to process massively parallel data faster than CPUs (central processing units, your processor). In fact, GPUs are so well suited to parallel processing on a massive scale that they are edging in on supercomputers as a superior way to run scientific modeling simulations. (source)
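The data-parallel idea is easy to sketch on the CPU side. This hypothetical Python/NumPy snippet applies the same operation to a million points two ways: one element at a time, and as a single vectorized call, which is the closest CPU-side analogy to a GPU handing each element to its own thread:

```python
import numpy as np

# Hypothetical workload: apply the same operation to a million data points.
data = np.arange(1_000_000, dtype=np.float64)

# Serial, one-element-at-a-time style.
serial = [x * 2.0 + 1.0 for x in data]

# Data-parallel style: one operation over the whole array at once.
# On a GPU, each element would be handled by its own lightweight thread;
# NumPy's vectorized call is only a rough stand-in for that model.
parallel = data * 2.0 + 1.0

assert np.allclose(serial, parallel)
```

Both produce identical results; the difference is that the second form expresses the whole computation as one bulk operation, which is exactly the shape of problem GPUs chew through.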
Already, groups like SETI and Folding@Home have begun harnessing GPUs in the search for aliens and proteins, respectively. Now pathologists and epidemiologists have jumped onto the bandwagon, using GPUs to simulate the introduction of pathogens into complex (human) immune systems as well as the spread of those pathogens in society as a whole.
In other words, every gaming PC just became a medical modeling supercomputer. Should this newfound power be used for good or for evil? Tell me below!
Anyone ever wonder where I get all the awesome pictures for this blog? Sure you do!
Formerly known as PicLens, Cooliris is the system responsible. Using the plugin (in Firefox 3.0, in my case), I can search image sites like Google or Flickr on an endless 3D wall. You can zoom in or out and fly at high speed along the length of the wall, allowing you to view an enormous amount of content very quickly. Compared to the built-in image search features on most websites, such as Google’s simple page-by-page layout, Cooliris not only increases my productivity many times over, it also looks really cool.
If you haven’t switched over to Google Chrome like I have, and you are still using Firefox 3.0, Internet Explorer 7, or Safari 3.1, I recommend downloading Cooliris and trying it out. Visit the site link above and watch the guided tour for more information.
Hopefully Cooliris will release a Chrome version. Chrome is built upon code borrowed from Safari and Firefox, so I don’t think the conversion should be too much work.
Hope everyone had a great previous couple of weeks, I sure did!
Up until now, differentiating between CPU products has been all about getting more cores and higher speeds. If you have the money, for instance, you want to buy a CPU with as many cores as possible running at the highest speed possible in order to get the fastest performance possible. An article by J. Scott Gardener opened my eyes to how much more complicated a smart CPU buying decision will be in the future.
In the past, CPUs were clocked as high as they could go (in terms of GHz) and priced accordingly. These days, MANY (I will exaggerate a bit and say MOST) CPUs are actually capable of reaching incredible speeds far exceeding their marketed performance grades. For instance, it is possible to take a 1.8GHz dual-core Intel processor and overclock it to just about 4.0GHz. In a simple sense, smart overclocking consumer x just doubled his performance for free. So why doesn’t Intel, AMD, or VIA just sell consumer x the processor already set to 4.0GHz? Because smart overclocking consumer x had a serious heatsink or liquid nitrogen cooling his CPU. Most consumers don’t opt for a mega-large CPU cooling tower or live in Siberia (ambient temperature has a measurable impact on CPUs).
At some point in the future, stating a CPU speed at retail will become meaningless, because the majority of produced CPUs will all perform far beyond the capacity of normal cooling solutions. What becomes the differentiating factor at that point? More cores aren’t always more useful if the CPU has to throttle itself to prevent overheating.
Personally, the only answer I can think of is some sort of thermal efficiency measurement. A higher value would mean the CPU could produce more performance for less power and less thermal output. A lower-value CPU might produce similar performance, but require more power and a more vigorous cooling solution.
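One way to picture such a metric is plain performance per watt. Here is a minimal sketch in Python; the benchmark scores and power draws are entirely made up for illustration and do not come from any real spec sheet:

```python
# Hypothetical spec sheets; all numbers are invented for illustration.
cpus = {
    "CPU A": {"score": 10_000, "watts": 65},   # similar speed, modest power draw
    "CPU B": {"score": 10_500, "watts": 130},  # similar speed, twice the power
}

def perf_per_watt(spec):
    """Benchmark score delivered per watt of power drawn."""
    return spec["score"] / spec["watts"]

for name, spec in cpus.items():
    print(f"{name}: {perf_per_watt(spec):.1f} points/W")
```

Under this made-up metric, CPU A delivers roughly twice the performance per watt of CPU B even though their raw scores are nearly identical, which is exactly the distinction a GHz number alone can't express.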
And, as always, this means that consumers get more for their money. Yay!
Oh yes, I am writing this post. In fact, I’m writing it within the subject of the article!
Google recently released Chrome, its own open-source internet browser. In a market dominated by Internet Explorer 7 (although IE8 is already available in beta 2 form) and Firefox 3, of course Google had to come up with its own solution. The real confusion is which open-source solution open-source advocates will rally behind. Firefox has been the primary open-source internet solution for a few years now, with “everyone else” using Internet Explorer (and a subset using Opera or Safari). Adding more confusion, Google has a working contract with Mozilla (the makers of Firefox) that extends all the way through 2011.
Either way, I’m happy that Google threw its hat into the ring. This shows that even open-source solutions can benefit from competition. And because everyone has access to the code, the winning modules or solutions can be folded into the “losing team” anyway. From my understanding, Chrome uses the open-source page renderer WebKit (created by Apple) and source code from Firefox itself!
So how is Chrome different from the other guys? For one, they’ve revamped the “home page”. Now, your home page consists of a 3×3 snapshot grid of your most visited websites along with recent favorites and a search bar. The tab system has been massively overhauled, spawning a new Chrome process on your computer for each tab. This kind of programming modularity gives Chrome extremely effective memory management and crash resistance. For a more thorough (and fairly technical) run-down, I’d recommend reading the Google Chrome Comic; I’ve posted the first page above.
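The process-per-tab idea can be sketched in a few lines of Python. The URLs and the "renderer" below are entirely made up, but the isolation mechanism mirrors the description above: each tab lives in its own OS process, so one tab crashing is reported via its exit code instead of taking the whole browser down.

```python
import multiprocessing as mp

def render_tab(url):
    """Stand-in for a tab's renderer; one hypothetical 'page' crashes outright."""
    if url == "http://crashy.example":
        raise RuntimeError("renderer crashed")

def open_tabs(urls):
    """Run each 'tab' in its own OS process, the way Chrome isolates tabs."""
    statuses = []
    for url in urls:
        proc = mp.Process(target=render_tab, args=(url,))
        proc.start()
        proc.join()
        # A non-zero exit code means that one tab died -- alone.
        statuses.append("crashed" if proc.exitcode != 0 else "ok")
    return statuses

if __name__ == "__main__":
    print(open_tabs(["http://a.example", "http://crashy.example", "http://b.example"]))
```

The parent process survives the crashed child and keeps serving the other tabs, which is the crash resistance the post describes; a shared-process browser would have lost everything.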
Try it out and tell me what you think!
The site went down! Sorry if anyone was “Forbidden” from entering, I cleared out all the rabid internet lolcats, so everyone should be safe.
Loyal reader Charlie submitted this news. Check out the newfangled motorcycle suit. Very cool, will probably never hit the mainstream, and may be extremely dangerous if a rock were to fly up into your crotch or face. The music is terrible, so just turn off your speakers and enjoy the futuristic design.