Gateway + BestBuy = Affordable Mobile Gaming

While few notebooks can even attempt to match the sheer performance of an equivalent desktop, gamers are nonetheless flocking to mobile systems to fulfill their needs. The gaming PC market still leans heavily toward desktops (a respectable gaming desktop starts at about $700 with a screen), but gaming notebooks are coming up fast as mobile profit margins grow at an astounding pace. Prices for gaming laptops (laptops that can handle games like Crysis, Supreme Commander, Mass Effect, or Gears of War) hover around the two thousand dollar mark, edging easily into the three and four thousand arena with a few upgrades. Yeah, I’m looking at you, sexy Alienware 17x.
Gateway (out of nowhere) has recently introduced respectable, some might even say affordable, gaming laptops in partnership with BestBuy.

Disregarding their previous entry (with BestBuy), this new system (dubbed the P-7811FX) includes some hefty specifications for a laptop coming in at $1399. (BestBuy’s site recently increased the price by 50 dollars; I’d call that serious demand.)

Gateway Gaming Laptop

* 17-inch WUXGA (1920×1200) display
* Intel Core 2 Duo P8400 (2.26GHz/ 1066MHz FSB/ 3MB L2) processor
* Nvidia GeForce 9800M-GTS graphics card with 512MB GDDR3 memory
* 4GB DDR3-800 RAM
* 200GB 7200RPM SATA hard drive
* Windows Vista Home Premium 64-bit
* HDMI port
* Weight starting at 9.2 lbs

(link to proof) (actual review posted after I wrote this article)

Just looking at the numbers, this machine is a BEAST. An equivalent PC with desktop versions of the same parts would undoubtedly be at least 30% faster at a somewhat lower price, but that doesn’t take much wind out of this system’s sails.

While many of my notebook-loving brethren appreciate the extremely high resolution of this 17-inch screen (1920×1200), it is far too pixel-dense for my eyes (and tastes). Still, with a 9800M GTS, the system is actually capable of running games at that native resolution with respectable frame rates.

So what is the takeaway? A desktop is still the way to go if you want to go all out in performance. A mobile system will never last quite as long (it takes much more abuse) and is still more expensive. Still, Gateway (and BestBuy) have done a great job bridging that gap. If this overall trend continues, and it certainly looks like it will, laptops will continue to drop in price while increasing performance.

Now if only I could get it in blue and green…

Welcome Back!

Hey there, y’all. It’s been quite the vacation from posting on the good old blog here, so I thought I’d return to you in a very meta fashion.

The ATI 4870X2 ($550) kicks the butt of every other “single” GPU available. Each card is outfitted with 2GB of GDDR5 RAM. It is actually possible to combine two of these cards in CrossFire X for a total of 4 GPUs and 4GB of GDDR5 RAM. 64-bit Vista is an absolute requirement in this case: a 32-bit OS can only address about 4GB total, and all that memory-mapped video RAM would leave your monster gaming system with virtually no usable system RAM. Woo for playing Crysis on a 30-inch monitor at 2560×1600! (link) Great, but I can’t wait for Fusion, a processor onto which AMD will be slapping at least 2 CPU cores and a next-generation (5000 series) GPU core. Can anyone say “the death of integrated graphics”?
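The 64-bit requirement is worth a quick back-of-the-envelope check. Here is a rough sketch (the reserved-mapping figure is my own assumption, purely for illustration):

```python
# Why 32-bit Vista chokes on a Crossfire X 4870X2 setup: a 32-bit OS can
# address at most 4GB, and memory-mapped video RAM is carved out of that
# same address space.
GB = 1024 ** 3

addressable_32bit = 4 * GB        # total 32-bit address space
video_ram = 2 * (2 * GB)          # two 4870X2 cards, 2GB GDDR5 each
reserved = 512 * 1024 ** 2        # assumed: chipset/PCI reserved mappings

usable_system_ram = max(0, addressable_32bit - video_ram - reserved)
print(f"Usable system RAM under 32-bit: {usable_system_ram / GB:.1f} GB")
```

The video memory alone fills the entire 32-bit address space, so a 64-bit OS is the only way to keep your system RAM usable.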

Intel has finally released the USB 3.0 specification. We are talking a 10x increase in transfer speed over USB 2.0. Cool…I guess. But with eSATA already pushing transfer speeds as high as internal SATA, who needs the extra speed for anything besides a USB key? It’s not going to make my mouse any faster, that’s for sure. (link)
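For the curious, the 10x figure checks out against the raw signaling rates in the specs (480 Mbps for Hi-Speed USB 2.0, 5 Gbps for SuperSpeed USB 3.0):

```python
# Sanity-checking the "10x" claim using each spec's signaling rate.
usb2_mbps = 480    # USB 2.0 Hi-Speed
usb3_mbps = 5000   # USB 3.0 SuperSpeed (5 Gbps)

speedup = usb3_mbps / usb2_mbps
print(f"USB 3.0 signals roughly {speedup:.1f}x faster than USB 2.0")
```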

I just saw Tropic Thunder in theaters and was pleasantly surprised! Ben Stiller grabbed Jack Black and Robert Downey Jr., as well as the curiously unmentioned Tom Cruise and Matthew McConaughey, and made an actually worthwhile comedy! Now I can forget about The Heartbreak Kid! *gag*

A new company has E. coli crapping out diesel. (link) It works like cellulosic ethanol (organic matter -> product). By producing diesel instead of ethanol, existing infrastructure can already transport and sell it; ethanol requires a slightly modified engine and more expensive pipelines because it is more corrosive than normal gasoline. The start-up responsible says that with a few genetic modifications the E. coli can also produce normal gasoline and even jet fuel! Very cool, but until this process is scaled up thousands and thousands of times, it isn’t much more than a proof of concept. The E. coli used is allegedly harmless, though. Can anyone say home-made diesel?

I can’t remember if I already posted this, but check out my favorite site on CFLs and LEDs. Seriously, click the link. When you get bored of the cave, pull on the lever. With recent LED breakthroughs, hopefully we can just forget about CFLs and transition completely to LEDs. Of course, it’ll take at least 2 years (it always does).

Space Siege was just released, on Steam even! Too bad early reviews (including one by my favorite video game editor, Jeff Green) call it absolutely average. I’ll still pick up the game (probably on Steam), but I’ll wait for the 20 dollar price drop in a few months.

And finally, wonder where I was? (click for full resolution image)

Canada Coast

Integrated Graphics, AMD’s PUMA

IGPs (Integrated Graphics Platforms) have been the bane of gamers, especially mobile gamers, for many years. Created for casual computer users who want little more than to write Word documents, play Flash games, and watch YouTube, they often fall well below the mark for acceptable DVD/Blu-ray playback and PC gaming. For mobile users, the choice between a dedicated GPU (Graphics Processing Unit) and an integrated GPU merits consideration. Integrated GPUs use much less power, create less heat, and result in an all-around more mobile system. Dedicated GPUs use more power and create much more heat, but they allow gamers on the go to play the latest and greatest with a few tweaked settings.

Intel’s IGP solutions, the GMA series of integrated graphics (GMA 915, GMA 950, X3100, and the upcoming X4500), have been the most common and also the most frustrating. Intel doesn’t do graphics, so their IGPs are very hit and miss, especially in running games properly. ATI/AMD’s solutions, such as the X1250, and Nvidia’s solutions, such as the 7150, provide more acceptable gaming performance, but still not enough to play high-intensity games very effectively at medium/high settings.

AMD/ATI has finally broken the mold: their new 780G motherboard chipset contains the HD3200 IGP, which offers a significant performance improvement over all other IGPs. While a dedicated GPU will still perform better, the HD3200 is an important step forward. In a mobile system, the 780G is part of AMD’s PUMA platform, which also offers a port allowing an external GPU to be connected to the laptop, something neither practical nor effective until now. The following video compares the HD3200 with an older X3100 (which has since been replaced by the slightly improved, though still lacking, X4500).

AMD’s 4850

Recently I wrote about Nvidia’s new GPU series, the GTX 260/280 cards. These beasts cut a swath through the current crop of video cards, handily stomping anything that came before, but they did so at a high price: $400 and $650, respectively. Gamers appreciate these cards, but it is hard for anyone to justify spending so much money on a single component when an entire computer can be purchased for the same amount.
Radeon HD 4850

Now, various sites have released benchmarks of the AMD 4850, and it has me pleasantly surprised.

I spent a large enough word count getting into GPU specifics last time, so I’ll just outline this card’s major points. First, it performs very much like an Nvidia 9800GTX, which (before today) ran a hefty $300. That should stop most people right there: a 200 dollar card offering the same (or better) performance as a 300 dollar card. Well, in an attempt to rain on AMD’s parade, Nvidia has drastically cut the price of the 9800GTX down to $200! So really, considering the prices, the real fight is between the AMD 4850 and the Nvidia 9800GTX. They trade blows in most benchmarks: in some Nvidia pulls ahead and in others AMD is the champ, but in most cases performance differs by a small margin (10% or less). That said, the 4850 runs slightly cooler and requires slightly less wattage than the 9800GTX. It also requires only one six-pin power connector, while the 9800GTX requires two.

Crysis DX10

To the right, you can get a taste of how well the 4850 performs in the GPU-eating game we call Crysis (at various resolutions). Take a look at how well the 4850 performs not only against the 9800GTX but also against its fairly successful predecessor, the 3870.

Soon the 4870 will be released, offering MUCH faster memory and higher clock speeds. Some (highly excitable) reviewers have estimated as much as a 40% performance increase over the 4850. Soon after that will come the 4870X2, fusing two 4870 cores onto a single PCB, and possibly beating the GTX280 to a pulp. Still, the price won’t be quite as sweet; most estimates put the 4870 between $250 and $300. This will make for an interesting show in the coming months.

AMD/ATI has really blown the lid off this generation’s GPU battle. They have a competitive GPU at a competitive price with competitive features, and it is smaller, cooler, and slightly quieter and more power efficient than the 9800GTX. We haven’t seen this level of competition for almost 2 years now, and I’m glad it has returned.

Nvidia GTX 280/260 Benchmarks Released!

In advance of tomorrow’s launch, any tech site worth its salt has released a hands-on review of Nvidia’s next line of GPUs. The nomenclature of the new series completely resets Nvidia’s old system: the previous line of GPUs ended (presumably) at the 9800GX2, and now they are back in the hundreds with a GTX prefix in front of the entire line (so far).
OK, on to the random bits of information. This beast has 1.4 BILLION transistors. Nvidia’s last high-end card had only 754 million, so we are seeing almost double the brute-force capability. Here’s a little comparison between a GeForce GTX280 and a top-of-the-line dual-core Penryn CPU from Intel:

GTX 280 Die Size

Now technically, in a simplified sense, the older generation of Nvidia GPUs had 128 cores. Consumer CPUs have, at most, 4 cores. The new generation of Nvidia GPUs has 240 cores. That is an INSANE number of cores compared to a Central Processing Unit. Maybe that’s why the GTX 280 can suck down 236 watts at full bore. Now you might wonder: why not replace or augment your CPU with all of that crazy power in your GPU? Well, the industry is actually moving in that direction. The primary roadblock is that GPUs process data in a very specific, specialized way, while CPUs are built to process data in a very general way: GPUs are generally built to work with pixels, while CPUs are built to work with ANY data. We’ve already seen scientists use GPUs to do brute-force calculations much faster than CPUs, and we’ll see a more mainstream/consumer fusion of the two components in late 2009.
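To make the pixels-versus-general-data point concrete, here is a toy sketch (plain Python, not real GPU code) of the kind of work those 240 cores eat for breakfast:

```python
# Brightening an image touches every pixel independently: each output
# value depends only on its own input value. This "embarrassingly
# parallel" shape is exactly what GPU shader cores are built for.
def brighten(pixels, amount):
    # Every iteration of this loop could run at the same time on a
    # separate core, since no iteration depends on any other.
    return [min(255, p + amount) for p in pixels]

image = [0, 60, 120, 250]      # pretend grayscale pixel values
print(brighten(image, 10))     # -> [10, 70, 130, 255]
```

A 4-core CPU could split that list four ways; a GTX 280 could hand each of its 240 cores a slice. A general-purpose task full of branches and dependencies, on the other hand, can’t be chopped up like this, which is why the CPU isn’t going anywhere.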

So how much faster is it?! Compared to the previous single-card solution from Nvidia, the 9800GTX, it is roughly 50% faster in most games. Below is a benchmark graph (stolen from a review site) comparing the top-performing GPUs running Oblivion at maximum resolution and maximum detail. This resolution is so high that it is confined to 30-inch monitors; most users will not be pushing their GPU nearly this hard. The score is the number of rendered frames per second.

GTX 280 Benchmarks

The GTX280 will cost $650 and the GTX260 will cost $400.

GTX 280

Nvidia has done it again: the fastest single GPU in existence. AMD/ATI have their work cut out for them. We already know their 4870 won’t be as fast as the GTX280, but the pricing should theoretically be MUCH lower, and multi-GPU solutions could end up giving them a competitive advantage performance-wise. We’re in the midst of another GPU war! It’ll be a few more days till we get to see direct comparisons between the best from AMD and the best from Nvidia.

I. Can’t. Wait.

Week of 6/15/08

We’ve got three events upcoming!
On the 17th:

Firefox 3 will be released. Everyone go out and get it on launch day, we are trying to set a world record for most downloads in a single day.


The Spore Creature Creator will be released; more on this Tuesday.


On the 18th:

Nvidia releases their next line of GPUs! It seems like only a few months back the 9000 series hit the shelves, and now the GTX 200 line is already at our doorstep. AMD will retaliate with their 4000 series early next week.


Intel at it Again!

A few days back I wrote an abridged history of the CPU, spotlighting Intel and AMD in their never-ending battle for supremacy. Today, one of my favorite technical/review sites (AnandTech) snagged an early revision of the Nehalem architecture (Intel’s next big chip) and ran a few benchmarks. AMD must feel crushed, because Intel pulled out ALL the stops.
Nehalem chips won’t be available to consumers until the end of 2008 and beginning of 2009. They offer as many as 8 cores, each ‘HyperThreaded’ (a technology used in Intel’s older Pentium 4 chips) to present twice as many logical (processing-capable, virtual) cores. The biggest, baddest consumer Core 2 available today comes with a maximum of 4 cores. Testing one of the quad-core 2.66GHz Nehalem CPUs against one of Intel’s quad-core 2.66GHz Penryns (an updated Core 2 ‘Conroe’), the Nehalem still put the hurt on Penryn on a clock-for-clock basis. In other words, even at the same “GHz,” the Nehalem is much faster.
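The core math, spelled out (the doubling is exactly what HyperThreading promises: two logical cores presented to the OS per physical core):

```python
# Nehalem's top configuration: 8 physical cores, each HyperThreaded.
physical_cores = 8
threads_per_core = 2   # HyperThreading exposes 2 logical cores per core

logical_cores = physical_cores * threads_per_core
print(f"{physical_cores} physical cores -> {logical_cores} logical cores")
```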

Nehalem at Computex 2008 in Taipei, Taiwan.

To quote Anand himself, “First keep in mind that these performance numbers are early, and they were run on a partly crippled, very early platform. With that preface, the fact that Nehalem is still able to post these 20 – 50% performance gains says only one thing about Intel’s tick-tock cadence: they did it.”

Technological Leftovers

Finally! I went to BestBuy today, and I didn’t see a SINGLE FX series Nvidia graphics card. Why is that significant, you ask? BestBuy, OfficeMax, OfficeDepot, and Staples have been selling FX series Nvidia cards for several years, even though the FX series is almost SIX YEARS OLD! SIX. YEARS. OLD.
Do they even understand that the slot those cards fit into (AGP) has been discontinued for over a year? Do they understand that, even so, there are far superior 6 AND 7 series cards capable of fitting into those old slots? I am glad they finally stopped stocking them, but they should stop stocking the 6 series as well. Those are almost 4 years old, and yet they charge almost 100 dollars for the lowest-end model, which you can’t even buy online because it is so old. Holy hell, Nvidia is currently selling their 9000 series cards and is about to release their NEXT generation in less than a month!

The worst part is that the average consumer has NO idea that the ANCIENT hardware they are being sold at OfficeMax is ANCIENT and extremely overpriced! All the boxes are marked with timeless slogans like “Next Generation” or “SUPER POWERED PERFORMANCE,” so how are they supposed to know the card is so old it can barely run one YouTube video?


State of the Art Hardware. On Sale at Office Max for only $2000!

If BestBuy were still charging 100 bucks for a cassette Walkman, would people be outraged? The only reason these stores can pull this crap is that people are uninformed, and the stores know they’ll probably never be informed unless they are nerds like me. Bastards!

*sigh* Ok, so BestBuy down, Staples, OfficeMax, and OfficeDepot to go.


An Abridged History of the CPU

The processor is undoubtedly the most visible showcase of the pace of technological advancement in our society. We can proudly say our cellphones are faster than the building-sized ENIAC of the 1940s, and this is due in large part to the rapid development of the transistor and, on a larger scale, the processor. These days we’ve got dual cores, quad cores, and soon octo-cores coming down the pipe at speeds nearing 4GHz. I think people should take a look at how far we’ve really come, and whether we really need to go much further in the current computing era.
First, a bit of recent background. Several years ago, Intel was focused on its blow-dryer-worthy Pentium 4. They thought the future was in higher clocks and bigger chips. AMD, on the other hand, had cooler chips at lower clock speeds (the number of GHz or MHz) that still outperformed Intel’s best. AMD thought to put two of their processing cores onto a single chip, effectively doubling performance without increasing the clock speed. While in reality this didn’t offer perfect scaling (it wasn’t quite double the performance of a single core), it greatly increased AMD’s lead over Intel. Intel responded by putting two Pentium 4 cores into a single chip and naming it the Pentium D. The Pentium D was the apex of Intel’s volcano-grade processor technology (the NetBurst architecture). Little did anyone suspect, Intel’s Israel team was working on a processor named the Pentium M, aimed specifically at mobile systems (laptops), that would later become the progenitor of their massively successful Core 2 Duo.

The Pentium M was so much more efficient than the Pentium 4 that Intel shifted their entire mainstream line from Pentium 4/D to a modified M architecture. This was the beginning of the Core architecture. By modifying the Pentium M for desktop use and using two cores, Intel was able to significantly lessen the divide between their processors and AMD’s X2 line. A little while later, the Core 2 models came out and Intel was back in the lead. They even tried putting 4 cores on a single chip, creating the first consumer quad core chips.


AMD has been in quite a bad spot for the past couple of years, never quite able to catch up with Intel in a timely manner. But that is beside the point. Both companies offer affordable processors with four cores (AMD’s Phenom X4 line and Intel’s quad-core Core 2 line) and fairly high (above 2.4GHz) clock speeds. While Intel still has a noticeable lead in performance, I’ve begun to wonder if processors have reached a point where leaps in performance are no longer necessary for most consumers. Most software cannot even take advantage of the hardware it’s given (most programs are built to use at most 2 cores and can’t actively use more). Unless you are editing video or batch-processing data, more speed isn’t all that necessary.

Consider that the fastest Pentium 4 took 225 seconds to render a 3D image, while a semi-current Intel quad core, the QX6850, can render the same image in 38 seconds. The newer processor is roughly six times as fast in this case.
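A quick arithmetic check on that figure:

```python
# Render-time comparison from the post: Pentium 4 vs. Core 2 Quad QX6850.
p4_seconds = 225       # fastest Pentium 4
qx6850_seconds = 38    # QX6850 rendering the same image

speedup = p4_seconds / qx6850_seconds
print(f"The QX6850 is about {speedup:.1f}x as fast on this render")
```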


Most computers are bottlenecked by their hard drives and the amount of RAM long before the processor is a problem. When playing games, it is the GPU that is stressed hardest.

So! With Nehalem on the way from Intel, and Bulldozer and Fusion from AMD, processors will continue to become faster and more threaded (meaning many more cores). I do wonder, though: how long will it be till the software catches up? When will going from 2 to 4 to 8 cores really show me faster and better computing outside of media manipulation? Is there anything that actually needs more speed in the first place? Something tells me we need more than just gaming to keep pushing processors forward.

Wireless Network Craziness

I’ve finally moved on up to WPA2. Network setup is by far the most confusing and challenging aspect of computers I’ve come up against so far. Our first Dell 802.11b router (2001) took 6 months to set up properly, and even then, all we could do was share internet. We’ve come a long way since then. I initially thought I’d shift our whole network up to Draft 2.0 N for the speed, but the cost of upgrading all of our adapters stopped me. Instead, I considered moving from a mixed b and g network to a solid g system. I had to get a couple of cheap adapters, but in doing so I improved network throughput by 50%. I can’t believe I didn’t do it sooner. A few hours later I realized I could dump my pathetic WEP security, which I once cracked for fun while waiting for a TV commercial to end, for superior WPA2. I knew that Draft N suffers severely degraded throughput when using WEP or WPA1 (cutting speeds by as much as 80%), but I had no idea the same held true for my oldish G router.

So, I set it up and suddenly throughput improved by over 250%. Two hundred and fifty percent…seriously…

Well, anyway, after shifting encryption algorithms a couple of times, I finally found what I think are the most secure settings (thank you, NSA!), and we’re off! No need to drop 500 bucks on a new N network; my old G still had room to improve (300%, in fact).
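One gotcha with percentages like these: stacked improvements multiply rather than add. A sketch with an illustrative (made-up) baseline and an assumed second step, just to show the compounding:

```python
# How back-to-back throughput improvements compound. The baseline and
# the WPA2 factor here are hypothetical, for illustration only.
baseline_mbps = 6.0                   # assumed starting throughput
after_g_only = baseline_mbps * 1.5    # +50% from dropping the b clients
after_wpa2 = after_g_only * 2.0       # assumed doubling from the WPA2 switch

total_gain = (after_wpa2 / baseline_mbps - 1) * 100
print(f"Net improvement over baseline: {total_gain:.0f}%")
```

Note that a 50% gain followed by a 100% gain nets 200% overall (1.5 × 2.0 = 3x), not 150%, which is why a few modest tweaks can add up to a shockingly large total.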

Maybe I should go into network administration…