Thursday, December 28, 2006

Zune only has 2% of the market? Who would have guessed.

Checked DailyTech, and one of the current headlines is about the success of the iPod yet again during the Christmas season:

Microsoft's Zune also found space in the article, but I'm sure a lot of readers are familiar with the adage that bad publicity is just as good as good publicity. The idea is that even if you create a ruckus about a product through bad advertising, the people who report on that bad advertising have to mention your product by name, which means people hear about your product anyway. Rockstar Games has used this concept to great effect with Grand Theft Auto, Bully, and other games. News agencies complaining about the games have to refer to Rockstar's developers and to the games by name, which means that the average person watching Fox News is going to hear about Grand Theft Auto. They may not understand it, but the name is still placed out there. This is also how Jack Thompson got such prominence, if you ask me. But I digress.

There apparently is some surprise within the music industry that the Zune has taken a nosedive, not that Steve Ballmer's infamous "squirt" terminology was helping the system. Apparently a lot of people within the music industry were expecting the Zune to be a success. After all, it was selling right next to the red-hot Xbox 360 and Nintendo Wii systems in GameStop, and it surely is/was a better media player than Sony's Mylo and PlayStation Portable. Why would it fail?

I, for one, do not find the Zune's performance surprising at all. The Zune is basically a repackaged Toshiba Gigabeat, a player that almost nobody has ever heard of, and for good reason. Now, I could be wrong about this, but I wasn't able to find a positive review of the Toshiba Gigabeat; most of the reviews I've seen lump it in with Dell's failed Ditty.

That being said, it is plausible that the Zune could pass one million units shipped by June 2007. However, there is a key word in there that I think a lot of people are skipping over, and that is the word shipped. Now, we've already been in this mess before with Sony and the original PlayStation, not to mention the PlayStation Portable. Shipped units are not sold units. We've also had this before with the original Xbox from Microsoft, which used shipped units in the US instead of sold units to give the perception of higher sales numbers than the competing GameCube from Nintendo. While the Xbox eventually did take a substantial lead in the US, many analysts still try to use the shipped numbers instead of retail sales numbers.

The catch is that units shipped from the warehouse are not sold units. The fact is, Sony and Microsoft have a long history of embellishing sales numbers by either mis-reporting the numbers or changing the way the numbers are presented. I, personally, already hammered Sony on that back on Monday, January 23rd, 2006 at 08:30 AM on Gamenikki's blog. The Sony PSP market dissection is reprinted here:

To quote myself:

The question is why? Why would Sony not expect to sell as many PSP units in 2006 as it did in 2005? We'll get to that later, but for now, let's take a look at another piece of the puzzle.

At this time there is no doubt about the domination of the PS1 and the PS2 in the gaming world. The PS1 is by far the world's most popular console ever, and the PS2 is by far the most popular of the current-generation consoles. It was not always like that, though. During the late 1990s Sony was known to pull many tricks to make sales of the PlayStation appear better than sales of competing consoles such as the Nintendo 64. Sony was known for counting consoles returned for repairs and shipped back out as new consoles. Another tactic used back then was that Sony would list the number of PlayStation units shipped out as the number of units... um. Well, the number of units sold.

Sounds suddenly familiar, doesn't it? Sony didn't sell 10 million plus PSP units, but they did ship that many. Now, what I want to ask is what the average user is going to think when they browse through press listings or blurbs on a news site. Those are big numbers being tossed about, and the idea of a million of anything is a lot to the average web page browser. Which looks more competitive: 9 million plus handhelds sold versus 13.5 million handhelds sold, or 10 million plus handhelds versus 13.5 million handhelds?
The fact is, Microsoft isn't promising that one million units will have been sold by June 2007. All they promise is that one million units will have shipped from the factory. Now, if Microsoft wants to step forward and say how many Zunes they actually expect to sell to end users by December 2007, I think that number is going to be a few hundred thousand short of a million.
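To make the framing trick concrete, here's a quick sketch in Python, using the rough figures quoted above, of how reporting shipped units instead of sold units shrinks the apparent gap to a market leader:

```python
# Rough figures from the PSP discussion above, in millions of units.
# "Sold" is what reached end users; "shipped" is what left the warehouse.
competitor_sold = 13.5
units_sold = 9.0
units_shipped = 10.0

def share_of_leader(units, leader):
    """Express a unit count as a percentage of the market leader's count."""
    return round(units / leader * 100, 1)

print(share_of_leader(units_sold, competitor_sold))     # → 66.7
print(share_of_leader(units_shipped, competitor_sold))  # → 74.1
```

Same warehouse, same actual sales, but the press-release framing closes over seven percentage points of the gap on paper.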

There are several reasons why the Zune isn't selling, not the least of which are the terminology associated with the unit, the restrictions placed on sharing music, and the ugly design. The fact is, the Zune competes directly against "PlaysForSure" devices. Microsoft, as far as I see it, turned on its own sales partners and is reselling a "PlaysForSure" device without the "PlaysForSure" support. In addition, the portable music player field is littered with entities that have come and gone. Sony, who used to dominate with the Walkman, can't beat Apple. Rio, who created the market, is where now? Creative, who owns the high-quality home consumer sound market for computers, hasn't been able to dent Apple.

So why would analysts believe that Microsoft is capable of competing where the people who created the markets failed? Money isn't everything. Microsoft may have billions, and they may have their tax on new computer systems, but media players aren't locked into a personal computer.

Now, I'm tempted to continue on about how Microsoft can't leverage its illegal monopoly. At this point, I'm going to direct you elsewhere.

I'd start here:

Although this page gives a more convenient listing of all the articles.

By the way, a slight word of warning: the writer of the Roughly Drafted articles is very pro-Apple and does not really cover or view open source effectively. Cited examples include the claim that Linux isn't ready for the desktop, something that Mepis disproved back in 2003 and that Ubuntu has helped reinforce.

The writer also attempts to make political jokes about the US government. As best I can figure, it does not appear that the Roughly Drafted writer ever actually took Government 101 in any high school. So, do keep a keg of salt on hand while reading through the articles there.

Wednesday, December 27, 2006

Chewing out Dailytech responders, one idiot at a time.

Yes, Trigger Tyme and Paintball has been pushed back. I've gotten some emails that... um. Well. Let's just say that I'm not wanting to get in the middle of whatever tiff is going on between Trigger Tyme and an apparent competitor.

Anyways, the subject of today's post is yet another person repeating an oft-disproved lie about Microsoft products. So, here we go. This is the original post I made on DailyTech, in response to Vista activation getting cracked again.

I found this line to be, well, hilarious:
Despite Microsoft's best efforts to shut down this latest exploit, it does leave us wondering just how secure this new operating system is if it can be poked at and prodded this early after release.
Many of the security features were already being cracked a year or more ago on the betas that were going out. Over the past six months, as the final release firmed up, the known exploits didn't stop. We knew literally months ago that Vista was no more secure than a properly configured and hardware-firewall-protected Windows XP or Linux OS.

The end fact is, it's still going to take work to protect any computer system, any OS. Vista has changed nothing in a systems administrator's job, end of story. We don't have to wonder. We know that, inherently, any Microsoft product will be less secure out of the box than its competitors. We know that people will still be earning paychecks fixing that.

Someone came along shortly after that and ran this comment:

Only less secure because the "competitors" have 3% market share, therefore nobody gives a (manure) about cracking them.
Oh... yeah. The old market-share-equals-more-attacks argument. Let's go ahead and eliminate this bunch of marketing mud right now.

Slight problem with that, Sharky.

Mind telling me what web server powers the majority of websites in use today? And what operating system is the basis for the most servers in use today?

I'll answer for you: Apache and Linux.

Despite having a majority in server usage for games, banking, shopping, web pages, FTP, or anything else where a server is used, you don't hear about major zero-day or near-zero-day exploits for Linux or Apache.
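For the curious: server surveys typically identify what powers a site by reading the Server header an HTTP response advertises. Here's a minimal Python sketch of the idea; the header block below is a made-up example, not captured from any real site, and real servers can hide or spoof this header.

```python
from email.parser import Parser

# A hypothetical raw HTTP response header block (example values only).
RAW_HEADERS = """Server: Apache/2.0.59 (Unix)
Content-Type: text/html
Content-Length: 1024
"""

def server_software(raw_headers):
    """Pull the advertised server software out of a raw header block."""
    # HTTP headers share their "Name: value" syntax with email headers,
    # so the standard library's email parser handles them fine.
    return Parser().parsestr(raw_headers).get("Server")

print(server_software(RAW_HEADERS))  # → Apache/2.0.59 (Unix)
```

Because the header is self-reported, surveys built on it are a hint about market share, not a precise count.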

The line that Microsoft products are less secure because they have a majority in the consumer market is inherently based on several false assumptions.

The first assumption is that Microsoft products are easier to crack. This is based on the sheer number of exploits that appear in the wild. However, there is very little code documentation available for Microsoft software, while with Linux and Apache there is ample documentation available, and even beginners in coding can grasp the way the programs actually work. The fact is, it is easier to modify a program that is fully documented than one that is not. If malicious crackers wanted to make SoBig and ILOVEYOU look like child's play, they would have attacked Linux and Apache servers.

The second assumption is that all of the computers sold by Microsoft-licensed OEMs remain with a Microsoft OS. This is partially based on the so-called "Microsoft tax": nearly every complete computer system sold on the market ships with a Microsoft OS. However, even a casual browsing of DistroWatch, or of the forums for Ubuntu, Mepis, PCLinuxOS, and other versions of Linux, shows something startling. There aren't just hundreds of posts from people reporting they've "switched"; there are several hundred thousand. The problem, if you want to call it that, is that there is no reliable metric for determining what operating system somebody is running on their computer. So, while it's safe to say that Microsoft probably has greater than 50% market share, assigning any percentage above that to Microsoft ignores ample evidence to the contrary.

The third assumption is that all cracks are equal. Now, I have not personally done this, but I do know people who have taken the time to dig through Symantec's reports on Windows and Linux vulnerabilities and the existing exploits for those vulnerabilities. We've all heard the marketing from Microsoft that it is more secure than Linux, and we've all heard from Linux developers that Microsoft isn't telling the whole story. Symantec bears that out. We've all heard about the penetration rates for Windows viruses: hundreds of computers are affected each month by a new vulnerability, and spyware and adware code is on a rampage. However, when looking at open-source OSes like Linux and applications like Apache, over 70% of the reported viral exploits have only been found in Symantec's testing labs, with a penetration of two computers.

So, not all cracks are equal. Not all exploits are going to do the same damage or initialize in the same manner. There are several people who do care about cracking Linux and Apache. They just have paying jobs from the likes of IBM, Novell, Red Hat, and Ubuntu.

Now, please stop the F.U.D. that Microsoft products are less secure because of market share. Confirmed majority market share is only true in one particular market, and it does not apply to or indicate anything about the industry as a whole.

Something else popped into my mind while I was posting this into Blogger.

Vista isn't even out yet for the average user. I can't go down to Best Buy and get a copy of Vista off the shelf right now. Most manufacturers are not even selling home PCs with Vista right now; almost all holiday models are Windows XP.

So, right now, Vista doesn't even have .01% of the home market, and its server penetration rate is far less. There is also a known issue with SQL Server 2005 Express, which does not run on Vista.

So, right now, Vista has the overall penetration of, say, BeOS, RISC OS, or maybe Amiga. This brings the market-share-ratio F.U.D. into perspective. Vista does not even have the market share of BeOS, and it already has more major published vulnerabilities and exploits than any of its three market-share peers.

Friday, December 15, 2006

Triggertyme... not yet :: 95% of who supports Novell Microsoft deal?

The Torrenza post had been sitting in the queue for a few days; I wanted to toss it out after TriggerTyme. But while writing the TriggerTyme report I noticed I was creating both a "report" of the weekend and a "Why I play paintball" entry.

So, that's now been split into two entries...

While I finish those up, I wanted to comment on this story from CNET:

Microsoft is touting that in a survey of 201 computing industry decision makers, 95% supported the agreement. Now, back on May 8th of 1998, Illiad's User Friendly comic strip commented on Microsoft's survey about whether or not Windows 98 should be stalled by the government.

User Friendly commented that Microsoft had 835 directors and vice presidents, which would have given the required 5-to-1 ratio out of 1002 respondents. Back then it made a nice joke, but the point of the comic strip still stands.

Microsoft does not clarify who was polled, or from which companies. All Microsoft states is that 201 computing industry decision makers were surveyed. What is the metric for determining how these decision makers were chosen? Was IBM polled? Was Sun polled? Was Red Hat polled? Was OpenOffice polled? Were Firefox developers polled? Were Apache developers polled? Were the Samba developers polled?

The quick answer is, no. None of the developers who have made industry-changing decisions in the past 10 years were polled.

So who did Microsoft poll then? Corel? SCO? SGI? Their own employees? Novell's employees?

Until Microsoft comes forward with clarification on who was polled, or agrees to run a new poll and explicitly ask the above-mentioned companies and developers their stance on the Novell/Microsoft agreement, we should file this as shameless promotion and ignore it.

Sunday, December 10, 2006

Torrenza: Why AMD may not have to introduce a new processor

Over some of the past entries I've commented on Intel's Conroe/Core2 design's inability to scale performance with processor speed. I've also commented on the Athlon64 design, which does scale predictably as the speed increases.

However, AMD has yet to introduce a direct competitor to the ultra-high-end Conroe/Core2 designs such as Kentsfield, nor does AMD seem to be in a hurry to do so. Many AMD fanatics seem convinced that AMD will answer Intel shot for shot, and fully expect quad-core to regain the performance crown. Many analysts seem to believe that AMD will release another version of the K8/Hammer design that will make up the few percentage points lost to Core2.

AMD may, but they don't have to. Back in 2001 and 2002 AMD was beginning to show off the architecture they called Hammer, a fully 64-bit processor with amazing potential. Early tests showed that Hammer was putting out performance numbers on par with AMD Thunderbird and Intel Willamette and Northwood designs that had a 1 GHz clock advantage.

When the Athlon XP 3200+ launched, AMD knew that the XP line was running out of steam; the Intel 3.2 GHz Pentium 4 was the clear winner in a majority of benchmarks. Rather than continuing to answer shot for shot with Intel, AMD pushed Hammer: the Athlon64 processor. Remember what Intel initially said about x86-64? That 64-bit was useless and that nobody wanted or needed it? And what actually happened as time went on? Intel only began x86-64 support with Prescott in 2005, two years after AMD, and even then applications compiled to x86-64 specs under Linux wouldn't run on Intel's version.

AMD redefined the battlefield rather than continuing an arms race of who could get the fastest clocks out of the processor.

It's just the same with Core2 and Athlon64. AMD has already been in this position before. Intel, while slow, is capable of competing. The Athlon64 would not always be the most powerful processor available; AMD has never said otherwise. Just as the 3.2 GHz Pentium 4 went beyond the Athlon XP 3200+, something would go beyond the Athlon64.

And again, AMD has redefined the battlefield, this time with the processor interconnect rather than with a new processor design.

Torrenza is basically cache-coherent HyperTransport, allowing processors or chipsets to be directly aware of what other processors or chipsets are doing. By opening up cache-coherent HyperTransport, AMD can link its processors directly with other processors that may not be of Athlon64/Hammer design. As is, Sun Microsystems, HP, and IBM are behind socket compatibility, placing their processors into AMD sockets. Sun is known for its SPARC architecture and the recent Niagara design. HP has the PA-RISC architecture and what is left of the Alpha architecture. IBM is one of the primary backers of the PowerPC architecture, and while Sony may be failing with Cell in the PlayStation 3, the Cell design is not one that can be dismissed.

Now, let me try to put some context here:

In the common computer today it is taken for granted that not all processors process in the same manner; some processors are better at some tasks than others. AMD and Intel typically focus on the central processor, the brains behind the computer, which collects, collates, arranges, and directs the computer's other systems. ATi and Nvidia typically focus on graphics, designing high-performance add-in cards to increase visual display quality. Creative markets a sound processor focused on producing the best sound possible. AGEIA markets an add-in card that removes physics computations from the central processor. BigFoot makes and markets the Killer NIC, a hardware networking card.

There is no question that each of these dedicated hardware parts is better in terms of performance than its integrated counterpart. The difference may be noticeable in only a few cases, but it is there.

Now, given that we take for granted that a computer can have a multitude of different tasks that are each served best by one architecture, is it really too much of a stretch to think that the same applies to the Central Processor?

Sun's Niagara design is a monster in several server tasks, but even with the most powerful video card in the world, don't expect it to run Unreal Tournament 1999. It is not built to run that kind of code.

What Torrenza enables is multi-central-processor systems that don't have to be one architecture. Let's say that somebody running a web server also wants to host a game server. Today, that means buying two different computer systems: one tuned to serve web pages and one tuned to serve games. With Torrenza, that person could buy one server, equip one socket with an Athlon64 and the other socket with a SPARC Niagara-design processor, and be able to do both.
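The scenario above amounts to dispatching each workload to whichever installed socket handles it best. Here's a conceptual Python sketch of that idea; the affinity table and names are purely hypothetical illustrations, not any real scheduler.

```python
# Hypothetical affinity table: which architecture suits which workload.
AFFINITY = {
    "web_serving": "Niagara",    # massively threaded server work
    "game_serving": "Athlon64",  # latency-sensitive x86 code
}

# One Torrenza-style box with two different architectures installed.
installed_sockets = {"Athlon64", "Niagara"}

def dispatch(workload):
    """Route a workload to its preferred installed processor, if present."""
    preferred = AFFINITY.get(workload)
    return preferred if preferred in installed_sockets else None

print(dispatch("web_serving"))   # → Niagara
print(dispatch("game_serving"))  # → Athlon64
```

One box, two architectures, and no second server to buy.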

That is how the battlefield is being redefined. AMD's Athlon64 is not the be-all/end-all processor design; there are some tasks that it just will never be as good at. With Torrenza, though, consumers can use a completely different processor that is stronger in the areas where the Athlon64 is not.

This is why AMD doesn't have to rush another processor architecture out the door, or ramp up the speed on the existing Athlon64 designs. A single server with an Athlon64 processor and a Cell processor inside would render Conroe processor designs irrelevant. Intel can have the single-application performance crown; that won't help when the competition is able to answer Conroe designs with equal performance in similar tasks, and also deliver a devastating performance difference in tasks that the Conroe architecture is not designed for.

Now, I do need to state that I am intentionally trying to avoid the operating system question in relation to Torrenza. There is only one operating system available that could possibly pull this merger of architectures off, and it's not from Microsoft. It may also be years before we actually see anybody putting anything like this together. However, consider this:

- Intel took 2 years to add x86-64 support to the Pentium 4
- Intel is 3 years behind AMD on delivering comparable x86-64 performance
- Intel is at least 5 years behind on a Direct Connect-like architecture

By the time Intel gets its Direct Connect-like architecture out the door, Torrenza could already be fulfilling the scenario outlined here.

Firefly MMO Flop in Progress |or| Intel Larrabee Flop in Progress

Recently, two bits of information popped up on some news feeds and caused a stir. The first is that Multiverse has licensed FireFly/Serenity and will be using their in-house MMO engine to make a FireFly MMORPG. The wraps have also been taken off of Intel's Larrabee project, which aims to take over from Nvidia and AMD/ATi as the provider of high-end graphics cards.

Wired News originally reported on Multiverse's FireFly acquisition here:

Apparently Multiverse intends to use FireFly as a highlight and high-profile example of what their engine can do. Now, when I went to the Multiverse site, I noticed a blurb on the side that states this:

November 9, 2006
Multiverse > SL > WoW?
Which is the future of virtual worlds? Experts or the masses? Multiverse appears to be hedging its bets by allowing its tool set to be used by either.

That is what starts ringing bells that something is wrong. Second Life is only able to handle a few thousand people at a time, is very slow to load, and generally looks like a lab project from somebody learning to make 3D objects. World of Warcraft can handle a few million people at a time, loads much faster, and looks like a professional-grade graphics project.

As you poke around the Multiverse site, something else jumps out. Multiverse does not actually make any games. Rather, they make a quoted "technology platform" for games. In other words, all Multiverse does is make a GAME ENGINE, which they expect to license out like id Software (Doom 3 engine), Epic Games (Unreal Engine 3), LithTech (Jupiter EX), or CryTek (CryEngine).

Forgive me if I think that Multiverse is already on the road to failure with this objective. The problem is, all Multiverse has to show off their engine is a demo world that comes across as a cheap copy of Second Life. Most other engine providers generally have a killer application that shows off what the engine can do.

id Software's Doom 3 engine is demonstrated by Doom 3 and Quake 4, both high-selling titles with massive franchise history. The engine's massive-multiplayer capabilities are being shown off by the next Enemy Territory game.

Epic Games backs Unreal Engine 3 up with Gears of War and Unreal Tournament 2007. As with Unreal Tournament 2003, a headline game shows off what the new engine is capable of. With the expansion of Assault into Conquest, Unreal Tournament 2007 also shows off the engine's capability to stream maps, big news for MMO-type games.

LithTech works closely with spun-off developer Monolith on the Jupiter and Jupiter EX engines. Unlike other high-profile engines, LithTech's premier title does not carry the name of the engine. On the other hand, I don't think Jupiter or EX could be worked into F.E.A.R., one of the few FPSes I ever felt was worthy of a perfect 10.

CryTek made a huge splash with Far Cry and Crysis, titles that set new records for how games should look, as well as potentially justifying spending over $500 on a graphics card to get those looks.

The point here is that Multiverse does not have a killer application on tap to demonstrate what the Multiverse engine can do. Rather, what Multiverse is trying to do is get someone else to build the killer application: their plan for the FireFly MMO is to hire a development team to bring the world to life. Unfortunately, the plan has a fatal flaw in it:

No one is available to hire who can program a Complex MMO.

Think about it for a second. The MMO market is flooded with development teams. Consider NCsoft and Webzen for a minute and the number of developers they have. Consider Sony Online Entertainment and Blizzard for a minute. These companies are already aggressively hiring mod makers, coders, and college students who demonstrate skills applicable to the MMO markets.

GUComics runs a comic panel every so often about The Zapper, commenting on deaths within the gaming industry. On August 16th the Zapper panel listed several companies/games that had gone under. The MMO market is just littered with failed projects that had massive marketing budgets and formerly high player counts. One of the most common problems is that the development teams simply could not put together a game that a lot of people wanted to play, the same problem NCsoft has right now. NCsoft has a lot of great games, such as Auto Assault. Is it good enough for me to pay every month for? No.

To put this in even further perspective, consider NCsoft's upcoming Tabula Rasa. Tabula Rasa has gained a lot of interest because of the person who is creating the game: Richard Garriott, AKA Lord British from Ultima. A lot of players eagerly await the game because they know what Richard Garriott has done in the past. Also consider City of Heroes from NCsoft, currently headed by Matt Miller, the creator and player of the character named Positron. If Matt Miller were to up and leave Paragon City, I'd be tagging right along behind, because I love CoH. I, personally, would never have looked at SOE's PlanetSide if it were not for the fact that the creators behind Tribes were bringing the game to life.

The fact is, once you know who a good game designer is, you'll be more likely to follow that game designer around if they move to another game.

Now, keeping that in mind, start putting the picture together. Multiverse is trying to assemble a development team to produce an MMORPG for a high-profile property. The problem is, the only people available to really pull together on an MMO project are either people not yet in the MMO industry, or those who have created games that failed in the MMO markets. People who have the talent to program great games are going to be actively courted by several different developers with high-profile projects. Let's say I get approached by NCsoft or Blizzard to be a writer, and I also get approached by some unknown company to write as well, for a long-term contract. Which one would I choose?

I'd go with the company that has an established track record.

Let me lay it out in shorter form.

Multiverse has
-no Killer Application on hand
-no well known developers on hand
-no reliable source of developers to compete with NCSoft, Webzen, SOE, and Blizzard

The phrase "cart before the horse" comes to mind when looking at what Multiverse is trying to do. The good news for Multiverse is that they aren't the only ones who have run into this problem.

Intel also has a problem with putting carts before horses, and that would be Larrabee. The Register reported on Larrabee as well. Basically, Intel feels it can take the high-performance video accelerator market away from Nvidia and AMD/ATi. Just as I think Multiverse is headed for failure, I think Larrabee is in trouble.

The problem is that, like game development, the pool of people capable of designing high-performance graphics is relatively small. One of the many reasons cited by ATi over the past several years for shying away from open-sourcing their graphics drivers is that everybody who could understand the drivers was already employed by ATi or by Nvidia. Looking at the market, ATi is/was probably right about that. Like the MMO markets, ATi and Nvidia have not been the only players; they've just been the best so far.

Consider 3dfx, makers of the popular Voodoo series of cards several years ago. Despite owning the performance market for games during the era of Windows 95 and 98, 3dfx is now property of Nvidia. Despite having a great graphics API and fantastic software support, hardware missteps took 3dfx out of the market.

Consider XGI, makers of the Volari. XGI was born when Silicon Integrated Systems purchased Trident and spun off the Xabre line of graphics accelerators. The Xabre was decently priced with decent drivers, but really could not make a dent against its competitors. XGI's hardware looked great on paper, but driver support was never good enough to let the cards perform. XGI was acquired by ATi.

Consider S3, backed by VIA. S3's Chrome cards have barely carved out a niche among home theater enthusiasts. Despite decent pricing and decent driver support, they have not been able to crack open more than a sliver of the graphics market.

What about former graphics great Matrox? Their Parhelia chip offered fantastic triple-screen support and the performance of a GeForce4 Ti 4200, at over three times the price. That was back in 2003, and since then Matrox has been nearly silent in GPU discussions.

PowerVR was another high-profile player, their last major design win going into the Dreamcast. Did anybody actually buy a Kyro card, though?

The landscape for graphics development is an extremely competitive one. While many have approached the graphics accelerator markets over the years, competition in the add-in market drove several out of business or into mergers. ATi and Nvidia aggressively recruit anybody who has talent with drivers or hardware design, and have done so for several years. Even when some of the purchases didn't seem to make sense to begin with, such as the 3dfx and XGI purchases, the results spoke for themselves. Nvidia brought back SLI, and XGI's hardware designers are no strangers to multi-chip setups either.

So, let's put this in perspective. ATi and Nvidia have a long history of spending millions of dollars to recruit anybody with talent in their industries.

Intel, which has never really left the graphics market, is openly laughed at and mocked by game developers (Mark Rein, for example) for its horrible integrated graphics.

Larrabee is a great idea, sure. Competition needs to exist. But let's be honest: if you want to work in graphics, either writing drivers or designing hardware, and you get approached by Nvidia, ATi, and Intel, who would you go with? Would you go to work for the companies with a reputation for high-performance graphics? Or would you go to work for a company known for making integrated graphics that can barely run Unreal Tournament 1999?

Again, anybody who is competitive or talented in the field is probably going to go for the companies with proven track records.

Now, maybe Intel can pull a rabbit out of its hat. Maybe they can make a competitive graphics chip based on general-purpose computing.

Keep this in mind, though. AMD launched and sold Athlon64 processors in bulk back in 2003. Intel's Core2 processor did not reach mass availability until around August 2006, and the majority of Intel processors available today are still of the old Pentium 4 design. It took Intel over three years to get a competitor to the Athlon64 out the door, and that competitor is still hampered by a front-side-bus design. Intel has a Direct Connect-like architecture on its timeline for 2008/2009. That puts Intel over five years behind HyperTransport version 1. HyperTransport itself is already in its third revision, and includes specifications for cables that can connect systems 3 meters away at processor latency.

So Intel is 3 years behind AMD on architecture design and over 5 years behind HyperTransport, and this is Intel's Core (... pun not really intended here; it's one of my complaints with Intel's choice of branding) market. Intel's bread and butter is the central processor and the supporting chipsets.

Graphics have been ancillary to Intel's markets for years. If Intel can't even keep pace against its core competitor(s), what makes us think it can compete with companies whose entire businesses are built on what, for Intel, is a sideline?

I, for one, don't think Intel can.


TriggerTyme report coming up next

Friday, December 08, 2006

HardOCP Steve and I agree... (again) :: Handwriting

Now, according to this article I found while reading HardOCP's news, handwriting is... quote, "irrelevant." Like Steve, who ran the news on HardOCP, I am skeptical of the validity of the report. Yes, I can type a lot faster than I can write, and a lot neater too. However, despite typing on a computer daily, I have never forgotten how to pick up a pencil and write out the alphabet, or how to form the letters.

This... report... if you will, just reeks of laziness on the part of the students. Writing is pretty much like riding a bicycle: once you get the patterns down, you really can't forget them. Personally, I think the teachers need to suspend all acceptance of printed materials and accept hand-written papers only. If the teachers cannot read a paper: automatic F.

I think that would jog the students' memory cells.

Thursday, December 07, 2006

I think I should be speechless about ActiveX on FireFox

Part of me does not want to even put the link in text... but, without the link, it is kind of pointless.

Somebody is making a plugin to run ActiveX controls in FireFox. Now, I don't know about my readers, whether from MepisLovers or elsewhere, but one of the main selling points of FireFox is that it does not support ActiveX. Even Microsoft admits that ActiveX is a bad idea, or at least a bad implementation of an idea.

I personally started using FireFox because I got tired of the ActiveX junk that Internet Explorer natively accumulates, and for no other reason. I didn't care about open standards. I didn't care about open code. I didn't care about tabbed browsing. I didn't care about small executables. I didn't care about integrated RSS readers. I was flat out tired of having to run two anti-virus and two anti-spyware programs every day in order to keep the system clean of junk accumulated by ActiveX and Internet Explorer.

Sure, now I care about those things. Now it does make a difference to me. But, the entire reason I moved to using and advocating Open Source software is that I was concerned about security. I wanted to remove as many entry points as possible. Now, we are very much aware of many security analysts out and about writing about how FireFox is no more secure than Internet Explorer. We've heard all about the security comparisons.

Let me put it this way. I can count on one hand the number of compromised and hijacked FireFox browsers I have had to fix. As a matter of fact, I don't even need a hand. Why? Because to this date, I've never actually seen an end user run into a FireFox exploit. Now, I have seen pop-ups get through. I have seen pop-unders. I've seen ads get through. But since I moved to FireFox on Windows, the worst browser problem I've seen has been tracking cookies. I can say the same for Opera on Windows.

Now, Internet Explorer, on the other hand, is a completely different story. When I was working for Sitel, we received on average 4 calls each, per day, from people with hijacked or compromised Internet Explorer browsers. Figure an average of 60 people taking calls over an 8-hour period, and that is/was 240 compromised Internet Explorer systems every 8 hours. And that was from our call center alone, never minding what San Diego, Call Tech, Sykes, or WTX-Cox handled (and forgetting the other regional call centers).
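For anyone who wants to sanity-check that figure, here is the back-of-the-envelope arithmetic as a quick sketch. The inputs are the numbers quoted above from my own recollection, not measured data:

```python
# Back-of-the-envelope check on the call-center numbers quoted above.
calls_per_agent_per_day = 4   # hijacked/compromised IE calls per agent per day
agents_on_duty = 60           # average agents taking calls over an 8-hour shift

compromised_per_shift = calls_per_agent_per_day * agents_on_duty
print(compromised_per_shift)  # -> 240 compromised IE systems every 8 hours
```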

Since Internet Explorer 7 launched, I've heard numerous reports of people whose Windows installs have been completely Borked, Swedish Chef style, by the Microsoft upgrade. I, personally, have had to deal with 23 different compromises since IE7 launched.

Does that put the real security situation into perspective for you? Now, I firmly lay the blame for the sad state of Internet Explorer security at the feet of ActiveX. I know that at least 20 of the problem IE7 installs I dealt with were caused by Microsoft's "need" to support ActiveX controls from IE6.

Now, when I read that someone is implementing ActiveX controls on FireFox, in ANY form, "needs of compatibility" is not the phrase that runs through my mind. "Gaping security hole" is. I don't know how we could dissuade the author(s) of the ActiveX plugin from continuing, and I understand that there are corporate interests being served that probably would not deploy FireFox (or Opera, for that matter) without ActiveX support.

So the question I pose to the author(s) is this: do you really think it is a good idea to implement a technology so dangerous that even its own creator is behind the message that it needs to be gotten rid of? I don't think so.

Monday, December 04, 2006

My Issue with Anandtech | or | why I'm not buying Intel's Kentsfield numbers

By this point in time if you read any news sites at all that have anything related to technology, you probably have heard of Intel's Kentsfield, their Quad-core processor. You probably have also heard amazing performance numbers for it, and probably have been told that it is the highest performing processor you can get.

Excuse me while I call foul on these reports.

Fact is, I'm not buying Kentsfield performance numbers, not by a long shot.

Now, I have several issues with the numbers and performance figures being bandied about by Intel and Intel-sponsored reviewers, and I'm going to start with Intel's Core2 processor. Core2 is why I'm fairly certain that once Kentsfield processors arrive en masse, people are going to find that it is a lemon.

Now, Kentsfield is built on Intel's Core2 processor design, better known as Conroe. It basically packages two dual-core Conroe dies side by side to get 4 cores in one processor. Let's review Core2, shall we? If you listen to a lot of review sites out there, Anandtech for one, you would believe that Intel's Core2 processor is the ultimate processor you can buy.

Not... really. A lot of reviewers called foul when Conroe launched. Many noted that the initial benchmarks were designed to fully utilize Conroe's massive on-board cache. Benchmarks showing realistic data sets were few and far between, and real-life use was a mystery.

Reality began to set in when HardOCP took the Core2 processor to the benchmark table. HardOCP strives to avoid canned benchmarks and instead tries to record actual game usage. Their... results... were surprising to many of the Conroe backers. While Core2 was faster than the Athlon64, it wasn't a blowout. Rather, it was only 2-5% faster. The picture HardOCP painted got worse. HardOCP used the 2.66GHz and the 2.93GHz Core2 in the tests, and the performance delta between the two was as small as the delta to the Athlon64: 2-5%.

So, let's do some quick math here, starting with the clock speeds of 2660MHz and 2930MHz.

2930 - 2660 = 270

270 / 2660 ≈ 0.10

The 2.93GHz Core2 has about a 10% clock rate advantage over the 2.66GHz version, yet the performance delta came nowhere near 10%.
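The same arithmetic as a quick sketch, using only the two clock speeds given above:

```python
# Clock-rate delta between the two Core2 parts HardOCP tested.
base_mhz = 2660   # Core2 at 2.66GHz
fast_mhz = 2930   # Core2 at 2.93GHz

delta_mhz = fast_mhz - base_mhz                 # 270MHz
advantage_pct = delta_mhz / base_mhz * 100      # ~10.2% clock advantage
print(f"{delta_mhz}MHz faster, a {advantage_pct:.1f}% clock advantage")
```

If performance scaled with clock speed, the faster part should post roughly 10% better numbers; per HardOCP's game testing it did not come close.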

Now, I personally know that overclocking an Athlon64 2800+ on Socket 754, on Biostar's T-Force, by 250MHz nets a significant performance advantage. Even overclocking it by just 100MHz nets a larger than 5% performance increase.

I also know that dropping the memory timings on a Socket 754 Athlon64 3200+ from 3-3-3-8 to 2-2-2-5 netted a larger than 5% performance boost. In many cases, ultra-tight memory timings can net anywhere from 15-20% more performance out of an Athlon64 design.

Yet, as Core2 propagated out into the market, users found that dropping memory timings didn't help. Overclocking doesn't help either. Once users rise above the 2.6GHz mark, Core2 performance just does not scale with clock speed.

Many users agreed early on that the problem was Intel's interconnect. Intel is still using the Front Side Bus design to connect its chips to the rest of the system...

and that's why I think the Kentsfield / QuadCore performance numbers are manure. Boosting the FSB speed is only a band-aid on a larger problem. Core2 already saturates the bus trying to move information in and out on dual-core systems. Does Intel really expect us to believe that a quad-core built on the same architecture can really be so powerful?

Core2, once it got out into the open, has proven to be a worthy competitor to the Athlon64. But an Athlon64 owner who pays attention to their choice of memory and their memory timings is not going to be outrun by Core2.

Intel apologists can no doubt take the stance that, well, even if the performance numbers are hyped, it will still be the most powerful processor, period, because of the massive caches. It will still be more powerful, so nyah.

The... counter-argument... is that power isn't everything. Consumers also have to figure in cost. Back when the AthlonX2 4800+ had just come out, I ran a post called Rig Recommendations on a Planetside forum. Since then, I've passed the Rig Recommendations article around, and it now sits on Mepisguides. One of the particular points I brought up in Rig Recommendations, and kept bringing up on forums, was that gamers had to weigh the cost of their processor against its power output.

I pointed out that at each Intel price point, the AMD processor of similar power was cheaper, and that if you compared at the same price, the AMD processor was always much more powerful than the Intel processor. To quote myself:
The Intel 3.8ghz 32bit costs $590

The Intel 3.6ghz 64bit also costs $590

The socket 754 Athlon64 3700+ that outruns them both costs $254
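The price gap in those quoted numbers can be put in ratio form with a quick sketch (the "power" figures aren't given numerically, so this only compares cost, not performance per dollar):

```python
# Price comparison using the figures quoted above (circa 2005 street prices).
prices = {
    "Intel 3.8GHz (32-bit)": 590,
    "Intel 3.6GHz (64-bit)": 590,
    "Athlon64 3700+ (Socket 754)": 254,
}

intel_price = prices["Intel 3.8GHz (32-bit)"]
amd_price = prices["Athlon64 3700+ (Socket 754)"]

ratio = amd_price / intel_price
print(f"The Athlon64 3700+ costs {ratio:.0%} of either Intel part")
# -> The Athlon64 3700+ costs 43% of either Intel part
```

In other words, the chip that outran both Intel parts came in at well under half their price.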
Now, being honest here, AMD hasn't exactly held the "ultimate" performance crown that often. Remember the AthlonXP? Remember how the Intel 3.2GHz Pentium4 ran rings around the AthlonXP 3200+?

Fact is, while AMD hasn't always held the performance crown, it has been very consistent at holding the price crown. I still remember explaining to somebody in a BurgerKing back in 2002 why he wanted an AMD processor and not an Intel design, and I explained it to him on price. Yes, if he had the money to get the most powerful system available, Intel was going to be his choice. But if he was stuck on a budget, Intel wasn't even a consideration.

The same holds true today. If you want to build the most powerful system you can for the majority of applications, you are going to get a Core2 system, and now a Kentsfield. But if you actually are on a budget, or say a fixed income, AMD's processors deliver much better bang per buck.

Now, bringing it back to Anandtech. I used to read Anand's site on a daily basis until a little over a year ago, I think. Anandtech tended to be pretty well balanced. Then, around the time Core2 launched, I began to get the feeling that all the motherboard and CPU articles were selling Intel products. I didn't see any investigation into the FSB issues with Core2. I haven't seen any parallels drawn between Conroe's saturation of the FSB and the Barton's saturation of the FSB. I haven't seen anything about how Intel will need to move to a Direct Connect-like architecture for newer processors to scale up in performance.

Instead, I read the article about AMD's 4x4 platform and am greeted by expressions of how Core2 dominates performance. Em. No. It doesn't.

Over time I've watched as Anandtech has slowly splattered egg on Anand's face with... questionable content... on the site.

I really wonder, when users actually start getting hands-on with Kentsfield, how big the backlash is going to be. How many people are going to realize that they've been misled?

Scoring high in PCMark is one thing. Being able to run City of Heroes for 8 hours during a Hamidon raid, or hosting a web server under *nix for years on end, is quite another. And I personally think that Kentsfield will be a lemon once it hits real work.