Back from the Dead! (Part 2)

Another long hiatus, another long post. It looks like the world economies got wind of my decrease in posting and suddenly we are spiraling toward a worldwide depression!
Meanwhile, one of my favorite processor companies gets into hot water and splits into two.

And now I learn that even my laptop will inevitably fall victim to a faulty GPU, exploding into flames and taking out my desk along with it! Oh how will I play Crysis Warhead now!?! Good thing the repair should be free.

Now guess what: due to a recently discovered vulnerability in Adobe Flash, all browsers are susceptible to being “clickjacked” by nefarious persons. What does this mean? Click on the wrong link, and your microphone and webcam can be secretly activated by some creepy dudes in Eastern Europe. A fix is in the works, but until then, get used to the idea of being watched.

Gamers and Medical Research

Recently, GPUs (graphics processing units) have garnered much media attention for their newly tapped ability to process massively parallel data faster than a CPU (central processing unit, your processor). In fact, GPUs are so suited to parallel processing on a massive scale that they are edging in on supercomputers as a superior way to run scientific modeling simulations. (source)
Already, groups like SETI and Folding@Home have begun harnessing GPUs in the search for aliens and proteins, respectively. Now pathologists and epidemiologists have jumped onto the bandwagon, using GPUs to simulate the introduction of pathogens into complex (human) immune systems as well as the spread of those pathogens through society as a whole.
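To see why these workloads fit a GPU so naturally, here is a toy sketch of my own (plain Python with NumPy; the model, function names, and numbers are all made up for illustration, not taken from any real epidemiology code). Thousands of independent epidemic runs are stepped in lockstep, and since no run depends on any other, the work is "embarrassingly parallel": exactly the shape of problem a GPU's hundreds of cores chew through.

```python
import numpy as np

def step_sir(s, i, r, beta=0.3, gamma=0.1):
    """Advance every simulation one day of a simple SIR epidemic model.

    s, i, r are arrays holding the susceptible/infected/recovered
    fractions for ALL simulations at once; one array slot per run.
    """
    new_infections = beta * s * i   # susceptible -> infected
    new_recoveries = gamma * i      # infected -> recovered
    return (s - new_infections,
            i + new_infections - new_recoveries,
            r + new_recoveries)

rng = np.random.default_rng(0)
n_runs = 10_000                              # one epidemic per array slot
s = np.full(n_runs, 0.99)
i = rng.uniform(0.001, 0.02, n_runs)         # each run seeded differently
r = np.zeros(n_runs)
i0 = i.copy()                                # remember starting infections

for _ in range(100):                         # 100 simulated days, all runs at once
    s, i, r = step_sir(s, i, r)

print(f"mean final recovered fraction: {r.mean():.3f}")
```

Every line in the loop body updates all 10,000 runs with one vectorized operation, which is the same "apply one small function to a huge batch of independent data" pattern a GPU executes across its cores.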

In other words, every gaming PC just became a medical modeling supercomputer. Should this newfound power be used for good or for evil? Tell me below!

Welcome Back!

Hey there, y'all. It's been quite the vacation from posting on the good old blog here, so I thought I'd return to you in very meta fashion.
Newsbits:

The ATI 4870X2 ($550) kicks the butt of every other “single” GPU available. Each card is outfitted with 2GB of GDDR5 RAM. It is actually possible to combine two of these cards in CrossFireX for a total of four GPUs and 4GB of GDDR5 RAM. 64-bit Vista is an absolute requirement in this case: a 32-bit OS can only address 4GB total, and once the cards' memory is mapped into that space you'd be running your monster gaming system with virtually no usable system RAM. Woo for playing Crysis on a 30-inch monitor at 2560×1600 resolution! (link) Great, but I can't wait for Fusion, AMD's upcoming processor that will slap at least two CPU cores and a next-generation (5000-series) GPU core onto a single chip. Can anyone say “the death of integrated graphics”?

Intel has finally released the USB 3.0 specification. We are talking a 10x increase in transfer speed over USB 2.0. Cool…I guess. But with eSATA already pushing transfer speeds as high as internal SATA, who needs the extra speed for anything besides a USB key? It's not going to make my mouse any faster, that's for sure. (link)

I just saw Tropic Thunder in theaters and was pleasantly surprised! Ben Stiller grabbed Jack Black and Robert Downey Jr., as well as the curiously unmentioned Tom Cruise and Matthew McConaughey, and made an actually worthwhile comedy! Now I can forget about The Heartbreak Kid! *gag*

A new company has E. coli crapping out diesel. (link) It works like cellulosic ethanol (organic matter → product). By producing diesel instead of ethanol, existing infrastructure can already transport and sell it; ethanol requires a slightly modified engine and more expensive pipelines because it is more corrosive than normal gasoline. The start-up responsible says that with a few genetic modifications the E. coli can also produce normal gasoline and even jet fuel! Very cool, but until this process is scaled up thousands and thousands of times, it isn't much more than a proof of concept. The E. coli used is allegedly harmless, though. Can anyone say home-made diesel?

I can’t remember if I already posted this, but check out my favorite site on CFLs and LEDs. Seriously, click the link. When you get bored of the cave, pull on the lever. With recent LED breakthroughs, hopefully we can just forget about CFLs and transition completely to LEDs. Of course, it’ll take at least two years (it always does).

Space Siege was just released, on Steam even! Too bad early reviews (including one by my favorite video game editor, Jeff Green) call it absolutely average. I’ll still pick it up (probably on Steam), but I’ll wait for the $20 price drop in a few months.

And finally, wonder where I was? (click for full resolution image)

Canada Coast

Nvidia GTX 280/260 Benchmarks Released!

In advance of tomorrow’s launch, every tech site worth its salt has released a hands-on review of Nvidia’s next line of GPUs. The nomenclature of the new series completely resets Nvidia’s old system: the previous line ended (presumably) at the 9800GX2, and now the numbers are back in the hundreds with a GTX prefix in front of the entire line (so far).
OK, on to the random bits of information. This beast has 1.4 BILLION transistors. Nvidia’s last high-end card had only 754 million transistors, so we are seeing almost double the brute-force capability. Here’s a little comparison between a GeForce GTX 280 and a top-of-the-line dual-core Penryn CPU from Intel:

GTX 280 Die Size

Now technically, in a simplified sense, the older generation of Nvidia GPUs had 128 cores. Consumer CPUs have at most four cores. The new generation of Nvidia GPUs has 240 cores. That is an INSANE number of cores compared to a central processing unit. Maybe that’s why the GTX 280 can suck down 236 watts at full bore. Now you might wonder: why not replace or augment your CPU with all of that crazy power in your GPU? The industry is actually moving in that direction. The primary roadblock is that GPUs process data in a very specific, specialized way, while CPUs are built to process data in a very general way: GPUs are built to work with pixels, while CPUs are built to work with ANY data. We’ve already seen scientists use GPUs to do brute-force calculations much faster than CPUs, and we’ll see a more mainstream/consumer fusion of the two components in late 2009.
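The "specialized way" GPUs work can be pictured as running one tiny function on every pixel at once. Here is a miniature sketch of that idea in plain Python (the function and values are my own invention, purely illustrative; `map()` just stands in for launching one GPU thread per pixel):

```python
def brighten(pixel, amount=40):
    """Per-pixel work: add brightness, clamped to the 0-255 range.

    On a real GPU, this little function would be the 'kernel' that
    hundreds of cores execute simultaneously, one pixel each.
    """
    return min(pixel + amount, 255)

# A CPU walks the image one pixel at a time...
image = [10, 100, 200, 250]
result_serial = [brighten(p) for p in image]

# ...while a GPU conceptually does all pixels at the same moment,
# which only works because no pixel's result depends on another's.
result_parallel = list(map(brighten, image))

print(result_serial)   # [50, 140, 240, 255]
```

General-purpose code full of branches and dependent steps does not break apart this cleanly, which is why the CPU is not going anywhere either.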

So how much faster is it?! Compared to the previous single-card solution from Nvidia, the 9800GTX, it is roughly 50% faster in most games. Below is a benchmark graph, stolen from AnandTech.com, comparing the top-performing GPUs running Oblivion at maximum resolution and maximum detail. That resolution is so high it is confined to 30-inch monitors; most users will not be pushing their GPU nearly this hard. Score is the number of rendered ‘frames’ per second.

GTX 280 Benchmarks

The GTX280 will cost $650 and the GTX260 will cost $400.

GTX 280

Nvidia has done it again: the fastest single GPU in existence. AMD/ATI have their work cut out for them. We already know their 4870 won’t be as fast as the GTX 280, but theoretically the pricing should be MUCH lower, and multi-GPU solutions could end up giving them a competitive advantage performance-wise. We’re in the midst of another GPU war! It’ll be a few more days till we get to see direct comparisons between the best from AMD and the best from Nvidia.

I. Can’t. Wait.