Thursday, December 28, 2006

Zune only has 2% of the market? Who would have guessed.

I checked DailyTech, and one of the current headlines is about the iPod's success yet again this Christmas season: http://www.dailytech.com/article.aspx?newsid=5493

Microsoft's Zune also found space in the article, but I'm sure a lot of readers are familiar with the adage that bad promotion is just as good as good promotion. The idea is that even if you create a ruckus about a product through bad advertising, the people who report on your bad advertising have to mention your product by name, which means people hear about your product anyway. Rockstar Games has used this concept to great effect with Grand Theft Auto, Bully, and other games. News agencies complaining about the games have to refer to Rockstar by name and refer to the games by their direct titles, which means the average person watching FoxNews is going to hear about Grand Theft Auto. They may not understand it, but the name is still placed out there. This is also how Jack Thompson got such prominence, if you ask me. But I digress.

There apparently is some surprise within the music industry that the Zune has taken a nosedive, not that Steve Ballmer's infamous "squirt" terminology was helping the system. Apparently a lot of people within the music industry were supposing that the Zune was going to be a success. After all, it was selling right next to the red-hot Xbox 360 and Nintendo Wii systems at GameStop. It surely is/was a better media player than Sony's mylo and Playstation Portable. Why would it fail?

I, for one, do not find the Zune's performance surprising at all. The Zune is basically a repackaged Toshiba Gigabeat ( http://www.gigabeat.com/ ), a player that almost nobody has ever heard of, and for good reason. Now, I could be wrong about this, but I wasn't able to find a positive review of the Toshiba Gigabeat; most of the reviews I've seen lump it in with Dell's failed Ditty.

That being said, it is plausible that the Zune could pass one million units shipped by June 2007. However, there is a key word in there that I think a lot of people are skipping over, and that is the word shipped. Now, we've already been through this mess before with Sony and the original Playstation, not to mention the Playstation Portable. Shipped units are not sold units. We've also seen this before with the original Xbox from Microsoft, which used shipped units in the US instead of sold units to give the perception of higher sales numbers than the competing GameCube from Nintendo. While the Xbox eventually did take a substantial lead in the US, many analysts still try to use the shipped numbers instead of retail sales numbers.

The catch is that units shipped from the warehouse are not sold units. The fact is, Sony and Microsoft have a long history of embellishing sales numbers, either by misreporting the numbers or by changing the way the numbers are presented. I, personally, already hammered Sony on that back on Monday, January 23rd, 2006 at 08:30:01 AM on Gamenikki's blog. The Sony PSP market dissection is reprinted here: http://www.gamenikki.com/g3/features/Jason.php?id=...

To quote myself:

The question is why? Why would Sony not expect to sell as many PSP units in 2006 as it did in 2005? We'll get to that later, but for now, let's take a look at another piece of the puzzle.

At this time there is no doubt about the domination of the PS1 and the PS2 in the gaming world. The PS1 is by far the world's most popular console, ever, and the PS2 is by far the most popular of the current generation consoles. It was not always like that, though. During the late 1990s Sony was known to pull many tricks to make sales of the Playstation appear better than sales of competing consoles, such as the Nintendo 64. Sony was known for counting consoles returned for repairs and shipped back out as new consoles. Another of the tactics used back then was that Sony would list the number of Playstation units shipped out as the number of units... um. Well, the number of units sold.

Sounds suddenly familiar, doesn't it? Sony didn't sell 10 million plus PSP units, but they did ship that many. Now, what I want to ask is, what is the average user going to think when they browse through press listings or blurbs on a news site? Those are big numbers being tossed about, and the idea of a million of anything is indeed a lot to the average web page browser. What exactly looks more competitive: 9 million plus handhelds sold versus 13.5 million handhelds sold, or 10 million plus handhelds versus 13.5 million handhelds?
The fact is, Microsoft isn't promising that one million units will have been sold by June 2007. All they promise is that one million units will have shipped from the factory. Now, if Microsoft wants to step forward and say how many Zunes they actually expect to sell to end users by December 2007, I think that number is going to be a few hundred thousand short of a million.
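
Just to show how much the presentation matters, here's some quick back-of-the-envelope math using the figures from the quote above (a sketch; the point is the framing, not the exact totals):

```python
# Back-of-the-envelope: how "shipped" figures shrink a perceived gap.
# Numbers are the handheld figures from the quote above.
competitor_sold = 13_500_000
psp_sold = 9_000_000      # roughly what actually reached end users
psp_shipped = 10_000_000  # what left the warehouses

def deficit_pct(mine, theirs):
    """Percentage by which 'mine' trails 'theirs'."""
    return (theirs - mine) / theirs * 100

print(f"Quoting sold units:    trailing by {deficit_pct(psp_sold, competitor_sold):.0f}%")
print(f"Quoting shipped units: trailing by {deficit_pct(psp_shipped, competitor_sold):.0f}%")
# sold:    trailing by 33%
# shipped: trailing by 26% -- same warehouses, friendlier press blurb
```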

There are several reasons why the Zune isn't selling, not least of which are the terminology associated with the unit, the restrictions placed on sharing music, and the ugly design. The fact is, the Zune competes directly against "Plays for Sure" devices. Microsoft, as far as I see it, turned on their own sales partners and is selling what amounts to a "Plays for Sure" device without the "Plays for Sure" support. In addition, the portable music player field is littered with entities that have come and gone. Sony, who used to dominate with the Walkman, can't beat Apple. Rio, who created the market, is where now? Creative, who owns the high-quality consumer sound market for computers, hasn't been able to dent Apple.

So why would analysts believe that Microsoft is capable of competing where the people who created the markets failed? Money isn't everything. Microsoft may have billions, and they may have their tax on new computer systems, but media players aren't locked into a personal computer.

Now, I'm tempted to continue on about how Microsoft can't leverage their illegal monopoly. At this point, I'm going to direct you to www.roughlydrafted.com instead.

I'd start here:
http://www.roughlydrafted.com/RD/Home/D2B2568D-4F51-4808-BB9E-07B800E6D44D.html

Although this page gives a more convenient listing of all the articles:
http://www.roughlydrafted.com/RD/RDM/1E546447-B703-4C61-8CC4-68849DF3C9AC.html

By the way, a slight word of warning: the writer of the Roughly Drafted articles is very pro-Apple, and does not really cover or view open source effectively. Cited examples include the claim that Linux isn't ready for the desktop, something Mepis disproved back in 2003 and Ubuntu has since reinforced.

The writer also attempts to make political jokes about the US government. As best I can figure, it does not appear that the Roughly Drafted writer ever took Government 101 in any high school. So, do keep a keg of salt on hand while reading through the articles there.

Wednesday, December 27, 2006

Chewing out Dailytech responders, one idiot at a time.

Yes, the Trigger Tyme and paintball posts have been pushed back. I've gotten some emails that... um. Well. Let's just say that I don't want to get in the middle of whatever tiff is going on between Trigger Tyme and an apparent competitor.

Anyways, the subject of today's post is yet another person repeating an oft-disproved lie about Microsoft products. So, here we go. This is the original post I made on Dailytech, in response to Vista Activation getting cracked again.

I found this line to be, well, hilarious:
Despite Microsoft's best efforts to shut down this latest exploit, it does leave us wondering just how secure this new operating system is if it can be poked at and prodded this early after release.
Many of the security features were already being cracked a year or more ago on the betas that were going out. Over the past 6 months, as the final release firmed up, the known exploits didn't stop. We knew literally months ago that Vista was no more secure than a properly configured, hardware-firewalled Windows XP or Linux OS.

The end fact is, it's still going to take work to protect any computer system, any OS. Vista has changed nothing about a systems administrator's job, end of story. We don't have to wonder. We know that, inherently, any Microsoft product will be less secure out of the box than its competitors. We know that people will still be earning paychecks fixing that.

Someone came along shortly after that and ran this comment:

Only less secure because the "competitors" have 3% market share, therefore nobody gives a (manure) about cracking them.
Oh... yeah. The old "market share equals more attacks" argument. Let's go ahead and eliminate this bunch of marketing mud right now.

Slight problem with that, Sharky.

Mind telling me what web server powers the majority of websites in use today? And what operating system forms the basis of the most servers in use today?

I'll answer for you: Apache and Linux.

Despite having the majority of server usage for games, banking, shopping, web pages, FTP, and anything else where a server is used, you don't hear about major zero-day or near-zero-day exploits for Linux or Apache.

The line that Microsoft products are less secure because they have a majority in the consumer market is inherently based on several false assumptions.

The first assumption is that Microsoft products only look easier to crack because of the sheer number of exploits that make appearances in the wild. However, there is very little code documentation available for Microsoft software. With Linux and Apache there is ample documentation available, and even beginners in coding can grasp how the programs actually work. The fact is, it is easier to modify a program that is fully documented than one that is not. If malicious crackers wanted to make SoBig and ILOVEYOU look like child's play, they would have attacked Linux and Apache servers.

The second assumption is that all of the computers sold by Microsoft-licensed OEMs remain with a Microsoft OS. This is partially rooted in the so-called "Microsoft tax": nearly every complete computer system sold on the market ships with a Microsoft OS. However, even a casual browsing of Distrowatch, or of the forums for Ubuntu, Mepis, PCLinuxOS, and other versions of Linux, shows something startling. There aren't just hundreds of posts from people reporting they've "switched"; there are several hundred thousand. The problem, if you want to call it that, is that there is no reliable metric for determining what operating system somebody is running on their computer. So, while it's safe to say that Microsoft probably has greater than 50% market share, assigning any particular percentage above that to Microsoft ignores ample evidence to the contrary.

The third assumption is that all cracks are equal. Now, I have not personally done this, but I do know people who have taken the time to dig down through Symantec's reports on Windows and Linux vulnerabilities and the existing exploits for those vulnerabilities. We've all heard the marketing from Microsoft that it is more secure than Linux, and we've all heard from Linux developers that Microsoft isn't telling the whole story. Symantec bears that out. We've all heard about the penetration rates for Windows viruses: hundreds of computers affected each month by a new vulnerability, spyware and adware code on a rampage. However, when looking at open-source OSes like Linux and applications like Apache, over 70% of the reported viral exploits have only ever been found in Symantec's testing labs, with a penetration of 2 computers.

So, not all cracks are equal. Not all exploits do the same damage, or initialize in the same manner. There are several people who do care about cracking Linux and Apache; they just have paying jobs from the likes of IBM, Novell, Red Hat, and Ubuntu.

Now, please stop the F.U.D. that Microsoft products are less secure only because of market share. Microsoft's confirmed majority market share holds in only one particular market, and it does not reflect the industry as a whole.

Something else popped into my mind while I was posting this into Blogger.

Vista isn't even out yet for the average user. I can't go down to Best Buy and get a copy of Vista off the shelf right now. Most manufacturers are not even selling home PCs with Vista right now; almost all holiday models are Windows XP.

So, right now, Vista doesn't even have .01% of the home market, and its server penetration rate is far less. There is also a known issue with SQL Server 2005 Express, which does not run on Vista.

So, right now, Vista has the overall penetration of, say, BeOS, RISC OS, or maybe the Amiga. This puts the market-share ratio F.U.D. into perspective: Vista does not even have the market share of BeOS, yet it already has more major published vulnerabilities and exploits than all three of its market-share peers.

Friday, December 15, 2006

Triggertyme... not yet :: 95% of who supports Novell Microsoft deal?

The Torrenza post had been sitting in the queue for a few days; I wanted to toss it out after TriggerTyme. But while writing the TriggerTyme report I noticed I was creating both a "report" of the weekend and a "why I play paintball" entry.

So, that's now been split into two entries...

While I finish those up, I wanted to comment on this story from Cnet:

http://news.com.com/2061-10795_3-6142793.html

Microsoft is touting that in a survey of 201 computing industry decision makers, 95% supported the agreement. Now, back on May 8th of 1998, Illiad's Userfriendly comic strip commented on Microsoft's survey about whether or not Windows 98 should be stalled by the government.

http://ars.userfriendly.org/cartoons/?id=19980508

Userfriendly commented that Microsoft had 835 directors and vice presidents, which would have supplied the required 5-to-1 ratio out of 1002 respondents. Back then it made a nice joke, but the point of the comic strip still stands.
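
For what it's worth, the comic's arithmetic checks out:

```python
# Userfriendly's joke, checked: 835 "for" out of 1002 respondents
# leaves 167 "against" -- exactly the 5-to-1 ratio the survey needed.
respondents, directors_and_vps = 1002, 835
against = respondents - directors_and_vps
print(directors_and_vps / against)  # 5.0 -- a perfect 5-to-1 split
```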

Microsoft does not clarify who was polled, or from which companies. All Microsoft states is that 201 computing industry decision makers were surveyed. What was the metric for determining how these decision makers were polled? Was IBM polled? Was Sun polled? Was Red Hat polled? Were OpenOffice developers polled? Were Firefox developers polled? Were Apache developers polled? Were the Samba developers polled?

The quick answer is, no. None of the developers who have made industry-changing decisions in the past 10 years were polled.

So who did Microsoft poll then? Corel? SCO? SGI? Their own employees? Novell's employees?

Until Microsoft comes forward with clarification on who was polled, or agrees to run a new poll that explicitly asks the above-mentioned companies and developers their stance on the Novell/Microsoft agreement, we should file this as shameless promotion and ignore it.

Sunday, December 10, 2006

Torrenza: Why AMD may not have to introduce a new processor

Over some of the past entries I've commented on the inability of Intel's Conroe/Core2 design to scale performance with processor speed. I've also commented on the Athlon64 design, which does scale predictably as the speed increases.

However, AMD has yet to introduce a direct competitor to the ultra-high-end Conroe/Core2 designs such as Kentsfield, nor does AMD seem to be in a hurry to do so. Many AMD fanatics seem convinced that AMD will answer Intel shot for shot, and fully expect an AMD quad-core to regain the performance crown. Many analysts seem to believe that AMD will release another version of the K8/Hammer design that will make up the few percentage points lost to Core2.

AMD may, but they don't have to. Back in 2001 and 2002 AMD was beginning to show off the architecture they called Hammer, a fully 64-bit processor with amazing potential. Early tests showed Hammer putting out performance numbers on par with AMD Thunderbird and Intel Willamette and Northwood designs that held a 1GHz clock advantage.

When the AthlonXP 3200+ launched, AMD knew that the XP line was running out of steam. The Intel 3.2GHz Pentium4 was the clear winner in a majority of benchmarks. Rather than continuing to answer shot for shot with Intel, AMD pushed Hammer, the Athlon64 processor. Remember what Intel initially said about x86-64? That 64-bit was useless and that nobody wanted or needed it? And what actually happened as time went on? Intel only began x86-64 support with Prescott in 2005, 2 years after AMD, and even then applications compiled to the x86-64 spec under Linux wouldn't run on Intel's version.

AMD redefined the battlefield rather than continuing an arms race of who could get the fastest clocks out of the processor.


It's just the same with Core2 and Athlon64. AMD has been in this position before. Intel, while slow, is capable of competing. The Athlon64 was never going to remain the most powerful processor available forever; AMD has never said otherwise. Just as the 3.2GHz Pentium4 went beyond the AthlonXP 3200+, something would go beyond the Athlon64.

And again, AMD has redefined the battlefield, this time with the processor interconnect rather than with a new processor design.

Torrenza is basically cache-coherent HyperTransport, allowing processors or chipsets to be directly aware of what other processors or chipsets are doing. By opening up cache-coherent HyperTransport, AMD can link their processors directly with other processors that are not of the Athlon64/Hammer design. As it stands, Sun Microsystems, HP, and IBM are behind socket compatibility, placing their processors into AMD sockets. Sun is known for its SPARC architecture and the recent Niagara design. HP has the PA-RISC architecture and what is left of the Alpha architecture. IBM is one of the primary backers of the PowerPC architecture, and while Sony may be failing with Cell in the Playstation3, the Cell design is not one that can be dismissed.

Now, let me try to put some context here:

In the common computer today it is taken for granted that not all processors process in the same manner. Some processors are better at some tasks than others. AMD and Intel typically focus on the central processor, the brains behind the computer, which collects, collates, arranges, and directs the computer's other systems. ATi and Nvidia typically focus on graphics, designing high-performance add-in cards to increase visual display quality. Creative markets a sound processor focused on producing the best sound possible. AGEIA markets an add-in card that removes physics computations from the central processor. BigFoot Networks makes and markets the Killer NIC, a hardware networking card.

There is no question that each of these dedicated hardware parts performs better than its integrated counterpart. The difference may be noticeable in only a few cases, but it is there.

Now, given that we take for granted that a computer can have a multitude of different tasks that are each served best by one architecture, is it really too much of a stretch to think that the same applies to the Central Processor?

Sun's Niagara design is a monster in several server tasks, but even with the most powerful video card in the world, don't expect it to run Unreal Tournament 1999. It is not built to run that kind of code.

What Torrenza enables is multi-central-processor systems that don't have to be one architecture. Let's say that somebody running a web server also wants to host a game server. Today, that means buying two different computer systems: one tuned to serve web pages and one tuned to serve games. With Torrenza, that person could buy one server, equip one socket with an Athlon64 and the other socket with a SPARC Niagara-design processor, and do both.
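
To make that concrete, here's a toy sketch of the placement decision a mixed-architecture box allows. The workload names and the "strength" scores are mine, purely for illustration; this is the idea, not actual Torrenza plumbing:

```python
# Toy model of a Torrenza-style box: two sockets, two architectures,
# workloads routed to whichever processor suits them best.
# Strength scores are invented for illustration only.
SOCKETS = {
    "athlon64": {"game_server": 0.9, "web_server": 0.6},   # strong single-thread
    "niagara":  {"game_server": 0.3, "web_server": 0.95},  # strong threaded I/O
}

def place(workload):
    """Pick the socket whose architecture scores best for this workload."""
    return max(SOCKETS, key=lambda cpu: SOCKETS[cpu].get(workload, 0.0))

for job in ("web_server", "game_server"):
    print(f"{job} -> {place(job)}")
# web_server -> niagara, game_server -> athlon64:
# one box, two architectures, each job on the hardware suited to it.
```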



That is how the battlefield is being redefined. AMD's Athlon64 is not the be-all/end-all processor design; there are some tasks it will simply never be as good at. With Torrenza, though, consumers can pair it with a completely different processor that is stronger in the areas where the Athlon64 is not.

This is why AMD doesn't have to rush another processor architecture out the door, or ramp up the speed of the existing Athlon64 designs. A single server with an Athlon64 processor and a Cell processor inside would render the Conroe designs irrelevant. Intel can keep the single-application performance crown; that won't help when the competition can answer Conroe with equal performance in similar tasks, and also deliver a devastating performance difference in tasks the Conroe architecture was never designed for.



Now, I do need to state that I am intentionally trying to avoid the operating system question in relation to Torrenza. There is only one operating system available that could plausibly pull this merger of architectures off, and it's not from Microsoft. It may also be years before we actually see anybody putting anything like this together. However, consider this:

Intel:
-took 2 years to add x86-64 support to the Pentium4
-is 3 years behind AMD on delivering comparable x86-64 performance
-is at least 5 years behind on a Direct Connect-like architecture

By the time Intel gets their Direct Connect-like architecture out the door, Torrenza could already be fulfilling the scenario outlined here.

Firefly MMO Flop in Progress |or| Intel Larrabee Flop in Progress

Recently two bits of information popped up on some news feeds and caused a stir. The first is that Multiverse has licensed Firefly/Serenity and will be using their in-house MMO engine to make a Firefly MMORPG. The wraps have also been taken off of Intel's Larrabee project, which aims to overtake Nvidia and AMD/ATi as the provider of high-end graphics cards.

Wired News originally reported on Multiverse's Firefly acquisition here:

Apparently Multiverse intends to use Firefly as a highlight and high-profile example of what their engine can do. Now, when I went to http://www.multiverse.net/ I noticed a blurb on the side that states this:

TerraNova
November 9, 2006
Multiverse > SL > WoW?
Which is the future of virtual worlds? Experts or the masses? Multiverse appears to be hedging its bets by allowing its tool set to be used by either.

That is what starts ringing bells that something is wrong. Second Life can only handle a few thousand people at a time, is very slow to load, and generally looks like a lab project from somebody learning to make 3D objects. World of Warcraft supports a player base in the millions, loads much faster, and looks like a professional-grade graphics project.

As you poke around the Multiverse site, something else jumps out. Multiverse does not actually make any games. Rather, they make a quoted "technology platform" for games. In other words, all Multiverse does is make a GAME ENGINE, and they expect to license the engine out like id Software (Doom3 engine), Epic Games (Unreal Engine 3), LithTech (Jupiter EX), or CryTek (CryEngine).

Forgive me if I think that Multiverse is already on the road to failure with this objective. The problem is, all Multiverse has to show off their engine is a demo world that comes across as a cheap copy of Second Life. Most other engine providers have a killer application that shows off what the engine can do.

id Software's Doom3 engine is demonstrated by Doom3 and Quake4, both high-selling titles with massive franchise history. The engine's multiplayer capabilities are being shown off by the next Enemy Territory game.

Epic Games backs Unreal Engine 3 up with Gears of War and Unreal Tournament 2007. As with Unreal Tournament 2003, a headline game shows off what the new engine is capable of. With the expansion of Assault into Conquest, Unreal Tournament 2007 also shows off the engine's capability to stream maps, big news for MMO-type games.

LithTech, itself spun off from developer Monolith, works closely with Monolith on the Jupiter and Jupiter EX engines. Unlike other high-profile engines, LithTech's premier title does not carry the name of the engine. On the other hand, I don't think "Jupiter" or "EX" could be worked into F.E.A.R., one of the few FPSes I ever felt was worthy of a perfect 10.

CryTek made a huge splash with Far Cry and the upcoming Crysis, titles that set new standards for how games should look, as well as potentially justifying spending over $500 on a graphics card to get those looks.

The point here is that Multiverse does not have a killer application on tap to demonstrate what the Multiverse engine can do. Rather, Multiverse is trying to get someone else to build the killer application. Their plan for the Firefly MMO is to hire a development team to bring the world to life. Unfortunately, the plan has a fatal flaw in it:

No one is available for hire who can program a complex MMO.

Think about it for a second. The MMO market is flooded with development teams. Consider NCsoft and Webzen for a minute and the number of developers they have. Consider Sony Online Entertainment and Blizzard for a minute. These companies are already aggressively hiring mod makers, coders, and college students who demonstrate skills applicable to the MMO market.

GU Comics (http://www.gucomics.com) runs a panel every so often about The Zapper, commenting on deaths within the gaming industry. On August 16th the Zapper panel listed several companies/games that had gone under. The MMO market is littered with failed projects that had massive marketing budgets and formerly high player counts. One of the most common problems is that the development teams simply could not put together a game that a lot of people wanted to play, the same problem NCsoft has right now. NCsoft has a lot of great games, such as Auto Assault. Are they good enough for me to pay every month for? No.

To put this in even further perspective, consider NCsoft's upcoming Tabula Rasa, found over here: http://www.playtr.com/index.html Tabula Rasa has gained a lot of interest because of the person creating the game: Richard Garriott, AKA Lord British from Ultima. A lot of players eagerly await the game because they know what Richard Garriott has done in the past. Also consider City of Heroes from NCsoft, headed currently by Matt Miller, the creator and player of the character named Positron. If Matt Miller were to up and leave Paragon City, I'd be tagging right along behind, because I love CoH. I, personally, would never have looked at SOE's Planetside if not for the fact that the creators behind Tribes were bringing the game to life.

The fact is, once you know who a good game designer is, you'll be more likely to follow that game designer around if they move to another game.

Now, keeping that in mind, start putting the picture together. Multiverse is trying to assemble a development team to produce an MMORPG for a high-profile property. The problem is, the only people available to really pull together an MMO project are either people not yet in the MMO industry, or those who have created games that failed in the MMO market. People with the talent to program great games are going to be actively courted by several different developers with high-profile projects. Let's say I get approached by NCsoft or Blizzard to be a writer, and I also get approached by some unknown company to write for them on a long-term contract. Which one would I choose?

I'd go with the company that has an established track record.

Let me lay it out in shorter form:

Multiverse has:
-no killer application on hand
-no well-known developers on hand
-no reliable source of developers to compete with NCsoft, Webzen, SOE, and Blizzard

The phrase "cart before the horse" comes to mind when looking at what Multiverse is trying to do. The good news for Multiverse is that they aren't the only ones who have run into this problem.

Intel also has a problem with putting carts before horses, and that would be Larrabee. The Register reported on Larrabee over here: Basically, Intel feels they can take the high-performance video accelerator market away from Nvidia and AMD/ATi. Just as I think Multiverse is headed for failure, I think Larrabee is in trouble.

The problem is that, like game development, the pool of people capable of designing high-performance graphics is relatively small. One of the many reasons ATi has cited over the past several years for shying away from open-sourcing their graphics drivers is that everybody who could understand the drivers was already employed by ATi or by Nvidia. Looking at the market, ATi is/was probably right about that. Like the MMO market, ATi and Nvidia have not been the only players; they've just been the best so far.

Consider 3dfx, makers of the popular Voodoo series of cards several years ago. Despite owning the gaming performance market during the Windows 95 and 98 era, 3dfx is now the property of Nvidia. Despite a great graphics API and fantastic software support, hardware missteps took 3dfx out of the market.

Consider XGI, makers of the Volari. XGI was born when Silicon Integrated Systems purchased Trident and spun off the Xabre line of graphics accelerators. The Xabre was decently priced with decent drivers, but really could not make a dent against its competitors. XGI's hardware looked great on paper, but driver support was never good enough to let the cards live up to it. XGI was acquired by ATi.

Consider S3, backed by VIA. S3's Chrome cards have barely carved out a niche among home theater enthusiasts. Despite decent pricing and decent driver support, they have not been able to crack open more than a sliver of the graphics market.

What about former graphics great Matrox? Their Parhelia chip offered fantastic triple-screen support and the performance of a GeForce4 Ti 4200 at over 3 times the price. That was back in 2002, and since then Matrox has been nearly silent in GPU discussions.

PowerVR was another high-profile player; their last major design win went into the Dreamcast. Did anybody actually buy a Kyro card, though?

The landscape for graphics development is an extremely competitive one. While many have approached the graphics accelerator market over the years, competition for the add-in market drove several out of business or into mergers. ATi and Nvidia aggressively recruit anybody who has talent with drivers or hardware design, and have done so for years. Even when some of the purchases didn't seem to make sense at the time, such as the 3dfx and XGI acquisitions, the results spoke for themselves. Nvidia brought back SLI, and XGI's hardware designers are no strangers to multi-chip setups either.


So, let's put this in perspective. ATi and Nvidia have a long history of spending millions of dollars to recruit anybody with talent in their industries.

Intel, which has never really left the graphics market, is openly laughed at and mocked by game developers (Mark Rein, for example) for their horrible integrated graphics.

Larrabee is a great idea, sure. Competition needs to exist. But let's be honest. If you want to work in graphics, either writing drivers or designing hardware, and you get approached by Nvidia, ATi, and Intel, who would you go with? Would you go to work for the companies with a reputation for high-performance graphics? Or would you go to work for a company known for making integrated graphics that can barely run Unreal Tournament 1999?

Again, anybody who is competitive or talented in the field is probably going to go with the companies that have proven track records.

Now, maybe Intel can pull a rabbit out of its hat. Maybe they can make a competitive graphics chip based on general-purpose computing.

Keep this in mind, though. AMD launched and sold Athlon64 processors in bulk back in 2003. Intel's Core2 processor did not reach mass availability until around August 2006, and the majority of Intel processors available today are still of the old Pentium4 design. It took Intel over 3 years to get a competitor to the Athlon64 out the door, and that competitor is still hampered by a front side bus design. Intel is timelining a Direct Connect-like architecture for 2008/2009. That puts Intel over 5 years behind HyperTransport version 1. HyperTransport itself is already in its 3rd revision, and includes specifications for cables that can connect systems 3 meters apart at processor latency.

So Intel is 3 years behind AMD on architecture design and over 5 years behind HyperTransport, and this is Intel's core market (...the pun was not really intended here; it's one of my complaints with Intel's choice of branding). Intel's bread and butter is the central processor and its supporting chipsets.

Graphics have been ancillary to Intel's markets for years. If Intel can't even compete against its core competitor, what makes us think it can compete with companies whose entire businesses are built on what, for Intel, is a secondary objective?

I, for one, don't think Intel can.

****


TriggerTyme report coming up next

Friday, December 08, 2006

HardOCP Steve and I agree... (again) :: Handwriting

http://www.thestar.com/NASApp/cs/ContentServer?pagename=thestar/Layout/Article_Type1&c=Article&cid=1165272610506&call_pageid=968332188492

Now, according to this article I found while reading HardOCP's news, handwriting is... quote, "irrelevant." Like Steve, who ran the news on HardOCP, I am skeptical of the validity of the report. Now, I can type a lot faster than I can write. I can type a lot neater than I can write. However, despite using a computer to type with on a daily basis, I have never forgotten how to pick up a pencil and write out the alphabet, or how to form the letters.

This... report... if you will, just reeks of laziness on the part of the students. Writing is pretty much like riding a bicycle: once you get the patterns down, you really can't forget them. Personally, I think the teachers need to suspend all acceptance of printed materials and accept hand-written papers only. If the teachers cannot read the papers, automatic F.

I think that would jog the memory cells on the students.

Thursday, December 07, 2006

I think I should be speechless about ActiveX on FireFox

Part of me does not want to even put the link in this post... but without the link, the post is kind of pointless.

http://www.iol.ie/~locka/mozilla/plugin.htm#introduction

Somebody is making a plugin to run ActiveX controls in FireFox. Now, I don't know about my readers, from MepisLovers or otherwise, but one of the main selling points of FireFox is that it does not support ActiveX. Even Microsoft admits that ActiveX is a bad idea, or at least a bad implementation of an idea.

I personally started using FireFox because I got tired of the ActiveX junk that Internet Explorer natively accumulates, and for no other reason. I didn't care about open standards. I didn't care about open code. I didn't care about tabbed browsing. I didn't care about small executables. I didn't care about integrated RSS readers. I was flat-out tired of having to run two anti-virus and two anti-spyware programs every day to keep the system clean of junk accumulated by ActiveX and Internet Explorer.

Sure, now I care about those things. Now they do make a difference to me. But the entire reason I moved to using and advocating open-source software is that I was concerned about security. I wanted to remove as many entry points as possible. Now, we are all very much aware of the security analysts out and about writing that FireFox is no more secure than Internet Explorer. We've heard all about the security comparisons.

Let me put it this way. I can count on one hand the number of compromised and hijacked FireFox browsers that I have had to fix. Matter of fact, I don't even need a hand. Why? Because to this date, I've never actually seen an end user run into a FireFox exploit. Now, I have seen pop-ups get through. I have seen pop-unders. I've seen ads get through. But since I moved to FireFox on Windows, the worst browser problem I've seen has been tracking cookies. I can say the same for Opera on Windows.

Now, Internet Explorer, on the other hand, is a completely different story. When I was working for Sitel we received, on average, 4 calls each per day from people with hijacked or compromised Internet Explorer browsers. Figure an average of 60 people taking calls over an 8-hour period, and that is/was 240 compromised Internet Explorer systems every 8 hours. That was from our call center alone, never mind what San Diego, Call Tech, Sykes, or WTX-Cox handled (and forgetting the other regional call centers).

After Internet Explorer 7 launched, I heard numerous reports of people whose Windows installs had been completely Swedish Chef borked by the Microsoft upgrade. I, personally, have had to deal with 23 different compromises since IE7 launched.

Does that put into perspective what the real security situation is? Now, I firmly lay the blame for the sad state of Internet Explorer security at the feet of ActiveX. I know that at least 20 of the problem IE7s I dealt with were caused by the "need" for Microsoft to support ActiveX controls from IE6.

Now, when I read that someone is implementing ActiveX controls in FireFox, in ANY form, "needs of compatibility" is not the phrase that runs through my mind. "Gaping security hole" is. Now, I don't know how we could dissuade the author(s) of the ActiveX plugin from continuing, and I understand that there are corporate interests being served that probably would not deploy FireFox (or Opera, for that matter) without ActiveX support.

The question I pose to the author(s) is this, then: do you really think implementing a technology so dangerous that even its creator backs the message that it needs to be gotten rid of is a good idea? I don't think so.

Monday, December 04, 2006

My Issue with Anandtech | or | why I'm not buying Intel's Kentsfield numbers

By this point in time, if you read any news sites at all related to technology, you have probably heard of Intel's Kentsfield, their quad-core processor. You have probably also heard amazing performance numbers for it, and have probably been told that it is the highest-performing processor you can get.

Excuse me while I call foul on these reports.

Fact is, I'm not buying the Kentsfield performance numbers, not by a long shot.

Now, I have several issues with the numbers and performance figures being bandied about by Intel and Intel-sponsored reviewers, and I'm going to start with Intel's Core2 processor. Core2 is why I'm fairly certain that once Kentsfield processors arrive en masse, people are going to find that it is a lemon.

Now, Kentsfield is built on Intel's Core2 processor design, better known as Conroe. It basically packages two dual-core Conroe dies together to put 4 cores in one socket. Let's review Core2, shall we? If you listen to a lot of review sites out there, Anandtech for one, you would believe that Intel's Core2 processor is the ultimate processor you can buy.

Not... really. A lot of reviewers called foul when Conroe launched. Many noted that the initial benchmarks were designed to fully utilize Conroe's massive onboard cache. Benchmarks showing realistic data sets were few and far between, and real-life use was a mystery.

Reality began to set in when HardOCP took the Core2 processor to the benchmark table. HardOCP strives to avoid canned benchmarks, and instead tries to record actual game usage. Their... results... were surprising to many of the Conroe backers. While Core2 was faster than the Athlon64, it wasn't a blowout; it was only 2-5% faster. The picture HardOCP painted got worse. HardOCP used the 2.66GHz Core2 and the 2.93GHz Core2 in the tests, and the performance delta between the two was as small as the delta to the Athlon64: 2-5%.

So, let's do some quick math here, starting with the clock speeds of 2660MHz and 2930MHz:

2930 - 2660 = 270

270 / 2660 ≈ 10.2%

The 2.93GHz Core2 has roughly a 10% clock rate advantage over the 2.66GHz version, yet the measured performance delta didn't come anywhere close to 10%.
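
Here's that scaling check as a reusable snippet; the 2-5% performance range is HardOCP's figure from above, and a design that scaled cleanly with clock speed would land near 100%:

```python
# Clock scaling check: how much of a clock increase shows up as performance?
def scaling_efficiency(clock_lo, clock_hi, perf_gain_pct):
    """Fraction of the clock-speed advantage that the benchmarks actually show."""
    clock_gain_pct = (clock_hi - clock_lo) / clock_lo * 100
    return perf_gain_pct / clock_gain_pct

for perf in (2, 5):  # HardOCP's 2-5% delta between the 2.66GHz and 2.93GHz parts
    print(f"{perf}% faster on a ~10.2% clock bump -> "
          f"{scaling_efficiency(2660, 2930, perf):.0%} scaling efficiency")
# 2% -> 20% efficiency, 5% -> 49% efficiency; clean scaling would be ~100%.
```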

Now, I personally know that overclocking an Athlon64 2800+ on Socket 754, on Biostar's T-Force board, by 250MHz nets a significant performance advantage. Even overclocking it by just 100MHz nets a larger than 5% performance increase.

I also know that tightening the memory timings on a Socket 754 Athlon64 3200+ from 3-3-3-8 to 2-2-2-5 netted a larger than 5% performance boost. In many cases ultra-tight memory timings can net anywhere from 15-20% more performance out of an Athlon64 design.

Yet, as Core2 began to propagate out into the market, users found that tightening memory timings didn't help. Overclocking does not help either. Once users rise above the 2.6GHz mark, Core2 performance just does not scale with clock speed.

Many users agreed early on that the problem was Intel's connection technology. Intel is still using the front side bus design to connect their chips to the rest of the system...

...and that's why I think the Kentsfield/quad-core performance numbers are manure. Boosting the FSB speed is only a band-aid on a larger problem. Core2 already saturates the bus trying to move information in and out on dual-core systems. Does Intel really expect us to believe that a quad-core built on the same architecture can really be so powerful?
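
Some rough bus arithmetic shows why. I'm assuming the launch Core2 FSB of 1066MT/s on a 64-bit (8-byte) data path; the exact numbers matter less than what happens to each core's share:

```python
# Rough FSB math: one shared front side bus, divided among the cores.
# Assumes the launch Core2 bus: 1066 MT/s across a 64-bit (8-byte) path.
fsb_transfers_per_sec = 1_066_000_000
bus_width_bytes = 8
total_bw_gb = fsb_transfers_per_sec * bus_width_bytes / 1e9  # ~8.5 GB/s, shared

for cores in (2, 4):
    print(f"{cores} cores on one FSB: ~{total_bw_gb / cores:.1f} GB/s per core")
# 2 cores: ~4.3 GB/s each
# 4 cores: ~2.1 GB/s each -- quad-core halves every core's slice of the
# same pipe, where an on-die memory controller grows with the socket count.
```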

Core2, once it got out into the open, has proven to be a worthy competitor to the Athlon64. But the Athlon64 owner who pays attention to their choice of memory and their memory timings is not going to be outrun by Core2.

Intel apologists can no doubt take the stance that, well, even if the performance numbers are hyped, it will be the most powerful processor, period, because of the massive caches. It will still be more powerful, so nyah.

The... counterargument... is that power isn't everything. Consumers also have to figure in cost. Back when the Athlon64 X2 4800+ had just come out, I ran a post called Rig Recommendations on a Planetside forum. Since then, I've passed the Rig Recommendations article around, and it now sits on Mepisguides. One of the particular points I brought up in Rig Recommendations, and kept bringing up on forums, was that gamers have to factor in the cost of their processor relative to its power.

I pointed out that at each Intel price point, the AMD processor of similar power was cheaper, and that when you compared processors at the same price, the AMD part was always much more powerful than the Intel part. To quote myself:
The Intel 3.8ghz 32bit costs $590

The Intel 3.6ghz 64bit also costs $590

The socket 754 Athlon64 3700+ that outruns them both costs $254
Now, being honest here, AMD hasn't exactly held the "ultimate" performance crown that often. Remember the AthlonXP? Remember how the Intel 3.2GHz Pentium4 ran rings around the AthlonXP 3200+?

Fact is, while AMD hasn't always held the performance crown, they've been very consistent at holding the price crown. I still remember explaining to somebody in a Burger King back in 2002 why he wanted an AMD processor and not an Intel design, and I explained it on price. Yes, if he had the money to get the most powerful system available, Intel was going to be his choice. But if he was stuck on a budget, Intel wasn't even a consideration.

The same holds true today. If you want to build the most powerful system you can for the majority of applications, you are going to get a Core2 system, and now a Kentsfield. But if you are actually on a budget, or say a fixed income, AMD's processors deliver much better bang per buck.

Now, bringing it back to Anandtech. Up until a little over a year ago, I read Anand's site on a daily basis, and Anandtech tended to be pretty well balanced. Then, around the Core2 launch, I began to get the feeling that all the motherboard and CPU articles were selling Intel products. I didn't see any investigation into the FSB issues with Core2. I haven't seen any parallels drawn between Conroe's saturation of the FSB and Barton's saturation of its FSB. I haven't seen anything about how Intel will need to move to a Direct Connect-like architecture for newer processors to scale up in performance.

Instead, I read the article about AMD's 4x4 platform and I'm greeted by pronouncements about how Core2 dominates performance. Em. No. It doesn't.

Over time I've watched as Anandtech has slowly splattered egg on Anand's face with... questionable content... on the site.

I really wonder, when users actually start getting hands-on with Kentsfield, how big the backlash is going to be. How many people are going to realize that they've been misled?

Scoring high in PCMark is one thing. Being able to run City of Heroes for 8 hours during a Hamidon raid, or hosting a web server under *nix for years on end, is quite another. And I personally think that Kentsfield will be a lemon once it hits real work.

Tuesday, November 28, 2006

Feel like shooting me with a Paintball Gun? Close to South Carolina?

Alright, let's find out how many people want to take a paintball marker and shoot me with it...

Right now I'm intending to be at TriggerTyme's year-end scenario game in Columbia, South Carolina. The upfront cost is a $5 donation or a toy for Toys for Tots. Renting a marker, getting air, and getting paint will cost a little bit more.

If you want more information, you can drop by the site and the forum announcement:

http://www.triggertyme.com/

http://www.pbjunkie.com/forums/showthread.php?t=19623

Monday, November 27, 2006

Turn, Jump, Spin Around : OpenSUSE IRC on Novell Deal

Alright, I originally wrote this post over on linux.com. Long story short: OpenSUSE hosted an IRC chat, supposedly to go over the deal Novell made with Microsoft.

Linux.com already has a slightly edited version up here

If you want to see the unedited text of the chat, you can click on this link

Anyways, these are my original thoughts on the event:
I'm still trying to plow through the unedited version after reading the lightly edited version here, and, to be honest, this isn't what I expected.

I, personally, expected the IRC chat to be about how OpenSUSE developers were going to handle the deal between Microsoft and Novell. I expected it to be about making plans to move forward, about which programs were and were not going to be worked on, and possibly about the technical details of how the interop center would actually function.

I also expected the chat to cover any potential plans for a mass evacuation of OpenSUSE developers to other projects if the worst fears were realized. I expected serious consideration to be given to Ubuntu, Debian, Gentoo, or Red Hat development.

I expected questions and answers about how this might affect KDE and its use on Windows, or how Novell might use KDE in the future. I expected questions about how Ximian and Evolution would be impacted by the deal, with increased Exchange server compatibility as a possible topic.

What I did not expect, and what we got, was spin about how the deal was good and how nothing would really change in the long run. The overall tone of the IRC chat, and the questions that were not asked or addressed...

Well, I'm going to state this: I have no problem with non-free software. I have no problem with paying for software offered at a reasonable price; after all, if I don't pay the coder or the designer, how are they going to afford to work on the project in the future?

I have no problem with the non-free drivers in use in Ubuntu. Like it or not, open source hasn't matched the performance the binary ATi and Nvidia drivers provide, and wireless with "free" drivers? Not happening on a lot of wireless chipsets.

Anyways, the point is, given what we just got from the OpenSUSE chat, I'm far more comfortable staying with Mepis built on Ubuntu. People may complain about Mark Shuttleworth's and Warren's use of non-free software, but at least they don't attempt to hide it or spin it.

Saturday, November 25, 2006

Let's talk about Hotel Wireless

Alright, right now I am up in Pigeon Forge, TN, on a family trip, and I've run into what could be described as a pet peeve of mine. Right now my laptops are only getting 5-15% signal from the hotel's wireless access service, and that is driving me bonkers. I am just on the very edge of the service bubble, close enough that a newer wireless adapter has no problem maintaining a connection at low signal levels, but an older Netgear card is going haywire and won't stop scanning channels.

("No problem" includes 600ms ping times and 14k-modem connection speeds; http://test.lvcm.com says I'm running at about 512k, while Planetside and City of Heroes say I'm not.)

Anyways, the reason it drives me bonkers is that I cannot figure out who in the world would set up a hotel's wireless coverage like this. It isn't the first time I've been on the edge of the service bubble, either. At E3 earlier this year (yeah, back when I had a regular paying job), the same exact thing occurred: just on the very edge of the wireless bubble, only with 0-5% signal.

I get the feeling that the hotels simply buy the Linksys / D-Link / whoever marketing blitz that says the wireless routers will reach for however many feet, place a router, walk the distance out, walk it out again, place another router, and so on. It seems it never occurs to a hotel to bring in a network specialist to advise them on wireless network building (I am available for this, and trust me, I will gladly travel 500+ miles to help set up wireless network infrastructure). The walls, plumbing, electrical lines, elevators, microwaves, and everything else will wreak havoc with the signal. The only way to really be sure that all rooms have "good" access is to actually walk into each room with a laptop, PDA, or wireless network locator and verify signal strengths of 50%+. The point at which you begin to lose 50% signal strength is where you need to make your halfway point. Walk the same distance again from that point, set another router, go back to the middle, measure the signal strength, and make sure that everybody can connect.


That way everybody who visits the hotel can connect at a decent speed.

Is that really so hard?
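
For the record, the survey loop I'm describing is simple enough to write down. This is a sketch of the procedure, not any vendor's tool; the 50% threshold is the rule of thumb from above:

```python
# Sketch of the site-survey walk described above: walk away from an access
# point, note where signal falls under the 50% rule of thumb, and treat
# that spot as the halfway point between this AP and the next one.
THRESHOLD = 50  # minimum acceptable signal strength, in percent

def coverage_edge(readings):
    """readings: (position_in_meters, signal_pct) pairs measured on the walk.
    Returns the position where coverage first drops below threshold."""
    for position, signal in readings:
        if signal < THRESHOLD:
            return position
    return None  # never dropped below: this AP covers the whole walk

walk = [(0, 98), (10, 90), (20, 74), (30, 61), (40, 47), (50, 31)]
edge = coverage_edge(walk)
if edge is not None:
    print(f"Signal drops under {THRESHOLD}% at {edge}m.")
    print(f"Make that the halfway point: next AP goes at roughly {2 * edge}m,")
    print("then walk back to the middle and re-verify every room at 50%+.")
```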


Note: about most of the way through writing this we went away to the Black Bear Jamboree Dinner and Show, so most of the vitriol I have against the hotel had kinda dissipated... until I reconnected at "very low" signal strength. The 4%-7% range.

Thursday, November 16, 2006

If you bought a PS3... I really feel sorry for you

A lot of different reports are starting to come in from people who have gotten their hands on the Playstation3, and the reports are not good. Now, I'm just going to link two of these reports for the sake of commentary. Source is Anandtech/Dailytech.

Issues with PS2 compatibility
http://www.dailytech.com/article.aspx?newsid=4936

Lack of Upscaler
http://www.dailytech.com/article.aspx?newsid=4971

The first problem with the Playstation3 at launch is that its PS2 compatibility list falls far short of what was expected. Just as the Playstation2 contained the hardware of the original Playstation, the Playstation3 contains the Emotion Engine and Graphics Synthesizer (read: one 128-bit MIPS CPU with 2 vector processors) that made up the Playstation2, so it was expected that Playstation2 compatibility would not be an issue. When the Playstation2 launched, a few obscure titles had issues, but most of them were Japan-only releases, and not really big sellers. So, imagine the shock people are getting with games like Xenosaga Episode II, Radiata Stories, Devil May Cry, and Suikoden III not working properly. Okay, so that's not the entire list of known problems, I just happen to... oh yeah. OWN THOSE GAMES. Other popular titles like Gran Turismo 4 and Tekken 5 also have issues.

Now, the excuses are already flying, with pointers to the lack of rumble in the Playstation3 controller and to the virtual memory cards replacing physical memory cards... but these are not exactly good excuses. The primary reason they are not good excuses is the operating system base for the Playstation2 and Playstation3: it is Linux. There should not have been any problem mapping the physical memory cards to virtual memory cards, and it should not have been any problem routing rumble commands to /dev/null. To have software issues like this when you have complete control of the operating system and the hardware is inexcusable.
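
To illustrate what I mean by "routing to /dev/null", here's the shape of the shim in question. Everything here is invented for illustration; these are not Sony's actual interfaces or paths:

```python
# Sketch of the compatibility shim being argued for: when you control the
# OS and the hardware, a missing feature becomes a mapping, not a failure.
# All names and paths here are hypothetical, invented for illustration.
VIRTUAL_CARD_DIR = "/var/ps2_memcards"  # hypothetical virtual-card location

def open_memory_card(slot):
    """Map a physical PS2 memory card slot onto a virtual card image file."""
    return open(f"{VIRTUAL_CARD_DIR}/slot{slot}.img", "r+b")

def send_rumble(controller, strength):
    """No rumble motor in the launch PS3 pad? Silently discard the command --
    the software equivalent of routing it to /dev/null."""
    pass

send_rumble(0, 255)            # the game thinks it rumbled; nothing breaks
# card = open_memory_card(0)   # the game reads/writes a virtual card image
```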

Granted, the most common problem isn't the game locking up, although that happens on a lot of titles. The most common problem is music getting off track, or no music at all. Which... Sony responded to with this gem:

some people can put up with playing games that lack sound

I am thinking that ranks up there with "5 million people will buy our console, even without any games." I really think the sound engineers working on Playstation3 titles need to consider how their work is being taken for granted.


The second major piece of news from the launch of the Playstation3 is the lack of an upscaler. This really isn't a big deal in and of itself. Anyone who has set a 640x480 picture as their desktop background on a 1024x768 screen and then selected "stretch" knows how badly the picture can pixelate. But, in the case of the Playstation3, the console supports resolutions of 480 interlaced, 480 progressive, 720 progressive, and at least 1080 interlaced. However, a lot of high-definition televisions on the market, including a lot of Sony models, do not support the 720 progressive resolution. Many only support 480 interlaced, 480 progressive, and 1080 interlaced.

So, Playstation3 games that do not support the 1080 interlaced picture, but do support 720 progressive, drop down to a 480 progressive picture on those sets.
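
As reported, the fallback works out to something like this sketch (the mode lists are illustrative; the point is that with no scaler, the console can only pick a mode both sides already support):

```python
# Sketch of the reported PS3 output selection: choose the best mode that
# BOTH the game and the TV support, with no upscaler to bridge the gap.
MODE_RANK = ["480i", "480p", "720p", "1080i"]  # ascending picture quality

def pick_output(game_modes, tv_modes):
    """Highest-ranked mode common to both lists; no scaling in between."""
    common = set(game_modes) & set(tv_modes)
    return max(common, key=MODE_RANK.index)

game = ["480p", "720p"]         # title renders 720p but not 1080i
tv = ["480i", "480p", "1080i"]  # many early HDTVs skip 720p entirely
print(pick_output(game, tv))    # -> "480p", on a television sold as HD
```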

Suddenly, Nintendo's decision to forgo anything above 480 progressive makes a lot more sense. It's not like I told everybody that... BACK WHEN THE GAMECUBE LAUNCHED. AND WHEN THE SPECIFICATIONS WERE REVEALED FOR THE 360 AND PS3. Oh wait a second... I did.

Anyways, the point being made is that at a 480 progressive downscale, the pictures are, well, blurry. I hate to break the physics to the rest of the video game crowd, BUT TELEVISIONS ARE BLURRY! CRT televisions depend on motion blur to fool your eyes into seeing motion. That is how televisions WORK.

Anyways, it isn't as if this issue wasn't foreseen by engineers, especially the engineers who created the high-definition standards and decided how high definition would be implemented in a television set.

My problem with Sony on this is that they really should have known better, and should have made sure that development teams checked the 480 progressive output of their games. I know Epic did with Gears of War; it looks great on both a 480 interlaced and a 720 progressive screen.

Wednesday, November 15, 2006

Update : what am I up to

kinda need to give a status update...

Right now I'm building the guides on 3 different systems, with the processors listed below. I'll post more information about the systems later.

AXp 2500+ / Radeon x1600
K6-2 500mhz / Volari ... something. I honestly forget.
Intel Pentium 4 530 / Geforce 6800


And yes, I know: no 64-bit systems in the line-up, and not really a balanced spread of systems in use, but it's what I have available.


OS is the most recent release of Mepis on DVD.

And... right now I have the following guides under progress

Opera: on-site Deb Repository Install
Opera: Ubuntu Source Install
Opera: .deb download / install
----
Opera deserves better than the guide I wrote before. There are a couple of reasons for doing all 3: the on-site .deb repositories might differ from the Ubuntu source, and the .deb download-and-install might be useful if you like to compile a CD or DVD with your favorite programs archived.

Valve's Linux Servers : Counter Strike / HalfLife
----
forum request

CDcat: http://cdcat.sourceforge.net/
----
IRC request. Seems interesting.

Mepis DVD Install
Mepis DVD install / home account recovery
----
might as well show the new process step by step.

Mepis DVD Install / Video Capture
----
Need to load up a straight video capture of the Mepis install, just to give people an idea of how easy it is. The catch for me is that I want to capture the video from Linux, not from Windows. However, I'm... still fuzzy on getting video capture working properly. Gonna give myself a couple of weeks to get it working. If I can't, I'll just run it through Windows...


-----------------------------------------------------


What would be nice for future guides:

I'd really like to redo the Unreal Guides using the Unreal Anthology pack, but as mentioned earlier, I'm kinda short on the... cash... side of things. I'm also interested in trying some more video cards in the systems... (so ATi/Amd... nvidia? hello?)

yeah, I know it's selfish asking for testing equipment... but then again, 2 weeks ago I couldn't have imagined asking for support on the guides, and I got a response that left me slackjawed.

------------------------------


one... more... item. I'm kinda... well... fuzzy on a point. I'm hesitant to list those who donate because of privacy issues... I just don't want to throw people's names out there...

so... kinda need some feedback... would you want your name listed? or would you rather stay anonymous?

alright, back to work. the K6 system takes forever to do anything.

Friday, November 03, 2006

*blink*... *blink*

Okay. um. Wow. I was not expecting that reaction...

I've had over $100 donated so far, so that means 2 new guides, or 4 remixes.

While I'm still boggling over that, now for some more technical issues.

What I need to figure out next is how to publish this blog directly to either the Mepisguides.com/index.html file or the Mepisguides.com/fti/index.html file...

I also need to figure out some way to put a counter on the front page, because at this point I'm placing myself in the hands of others, and I want them to be able to keep track of what I do with their money.

Am I nervous? Let me put it this way. I no longer sweat when facing a Windows Registry with stripped permissions. I'm sweating now.

Wednesday, November 01, 2006

Ouch... I guess I'm L.F.J. now.

Alright... ... I'm kinda embarrassed to write this, don't really want to... but here I go.

I used to write the LiveJournal over at http://saist.livejournal.com/. I had started that livejournal when I went to work for Sitel, an outsourcing company. We worked for Cox Communications for about 3 years, then our assets were sold to Suddenlink. About 3 months after Suddenlink took over, I finally quit Sitel. The paycheck just was not worth the amount of pain we were being put through.

After I left Sitel I joined up with a local gaming center, for a couple of reasons. The owner had been a friend of the family for years and he was looking at a theft problem that had been continuing since the gaming center had opened. My initial job, as such, was to basically babysit the place. Get it open for business and make sure nothing went out the door.

Unfortunately, over the past months I've fallen victim to the thefts as well: somebody has stolen system memory from my computers, my Nintendo DS has gone missing, several of my Nintendo DS games have also gone missing, and I've been shorted cash on the Soda fund. Basically, I've gone broke from the Tournament Center.

And that's where I am now. Yesterday I gave the owner a choice. Either he could get rid of the person who I believed was committing the thefts, or I was walking out the door. The owner chose to keep the thief.

So, as of now, I am jobless, and broke, and I haven't a clue what I'm going to do from this point. Part of me wants to start Mepisguides.com back up, but the guides are fairly time consuming considering the way I do them. Another part of me wants to go back full time into tech support or into fixing computers again.

Now, when I was updating Mepisguides on a regular basis (how long ago was that?), I had several emails from people wishing to donate money. I turned those requests down, asking that the money be sent to Cblue or Warren for Mepislovers or Mepis itself. Now... as embarrassed as I am to admit it, any of those donations would be nice...

So... I guess here's an outline for the deal. If people still want to donate to this loser, you could donate via paypal to mepisguides [space] @ [space] gmail.com

Part of me wants to guarantee a new guide for every $50 donated, or a remixed guide for $25, a remix being an update of an existing guide for Mepis 6.

That... should allow me to continue to pay off bills for food and electricity, and if the pace keeps up, might turn into a livable income.

The other outline is that if you do want to actually hire me, I've put my most recent resume up:

http://www.fcs-inc.biz/saist/trg/resume.html
http://www.fcs-inc.biz/saist/trg/resume.odt
http://www.fcs-inc.biz/saist/trg/resume.pdf

And yes, that's 3 different formats: HTML, Open Document, and Portable Document.

While I would personally prefer to keep working in the Augusta Georgia Area, I'm realizing that in my current situation, if I'm offered a job somewhere else, I guess I will move.

If you are interested, please send emails to jason.frothingham [space] @ [space] gmail.com


I'll... update the blog later with something more along the lines of what you probably want to read. For now though... I still have some systems on the test bench I need to finish up.

Oh, and if you are interested in a Mepis Linux loaded computer and you want to buy one... I'm willing to work on those too.

(and don't forget to remove the [space]s from the emails.)

Monday, October 16, 2006

Operating System Security:

By now, if you pay attention to any tech site, you've probably seen or heard about the issues that McAfee, Symantec, and other computer security firms have with Microsoft's Vista OS. A general sampling of the story can be found at, say, OS Weekly.

Now, as I see it, the issue is fairly simple. Based on reports, like one here on Anandtech's Dailytech site, Microsoft is working directly with Kaspersky and cutting other security vendors out of the Vista development process. Established firms like McAfee and Symantec are having none of it, and want full access to the Vista development process. You really can't blame them; after all, they do depend on Microsoft Windows to exist. You can easily find several articles covering the tit for tat between the security firms and Microsoft, but that isn't what I want to focus on.

I want to look at why this is a problem to begin with. Microsoft already stepped on a lot of IT toes with Windows Xp by proclaiming it the most secure Windows version ever. Technically, this would have been true if Xp hadn't been built on the WinNT 5 kernel, which carried over 6,000 known issues from Windows 2000 that Microsoft is never going to fix. That isn't the issue I want to look at either.

Let us be honest here: we as consumers take for granted that our Windows Operating Systems are going to be attacked. We take for granted that there are going to be viruses, spyware programs, ad-ware programs, and complete and total jackholes with nothing better to do than write malicious code. We, collectively, as consumers have accepted this. We have, collectively, gotten used to running virus scans, disk cleanup, disk defrag, and chkdsk /f /r (answer Y to schedule the scan, then reboot) on a weekly or daily basis. Collectively as consumers, we rolled our eyes when the most recent Vista builds got their security control schemes cracked, and collectively we went "Why did it take them that long anyways?" Collectively, as consumers, we've come to expect Microsoft to drop the ball completely on user security, and Microsoft delivers consistently.

Microsoft, of course, is fond of comparing their operating system security to Linux, their only real competitor. Now, I could link several stories focusing on Linux vs. Windows security, but let's accept this little fact: Both Operating Systems can be just as Secure as the Other.

Yes, I went there. The fact is, if you take the time to lock down user permissions, lock out ports, get a good hardware firewall, and lock general user access, both Linux and Windows can deliver similar security environments. The question that needs to be asked in the security debates is whether you are comparing apples to apples, oranges to oranges, or apples to oranges.
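For the record, here's a minimal sketch of the kind of lockdown I mean on the Linux side, assuming iptables and standard tools; the port number is illustrative:

    # default-deny firewall: drop everything inbound, then allow only what's needed
    iptables -P INPUT DROP
    iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
    iptables -A INPUT -p tcp --dport 22 -j ACCEPT    # ssh, and nothing else

    # lock down user permissions: home directories readable by their owners alone
    chmod 700 /home/*

Windows can be walked through an equivalent exercise with its own policy and firewall tools; the point is that both take deliberate effort.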

The fact is, Linux and Microsoft offer two different types of products. Microsoft offers a complete operating system under one brand name. Microsoft controls everything from the TCP/IP stack to the Desktop Environment, the Window Manager, the file system, I/O access, kernel access, and everything else needed to run the Operating System. On top of the Operating System, Microsoft offers the text editors, web browsers, media players, and utilities that allow the Operating System to actually be used.

Linux, however, is just the kernel. While the kernel is the core of the operating system, it needs more programs in order to actually do anything. You can use the command prompt, yes, but if you want to do anything with the Kernel, you are going to need a file system manager, I/O access, TCP/IP for the network, a window manager, and maybe a desktop environment. While it is taken for granted that a Linux distribution has these items, they are items that are added on top of the Linux Kernel.

This is where the comparisons of Windows to Linux generally fall apart. Linus T., Alan Cox, Marcelo Tosatti, and Andrew Morton do not make or sell a Linux distro directly, although they are among the most prominent kernel programmers. Other companies like Red Hat, Novell, or Mepis take the Linux Kernel and combine it with the tools and applications that make a usable distribution. Typically these tools and applications derive from the GNU's Not Unix (GNU) Operating System, although the range of Free and Open Source Software today means that a lot of non-GNU software ends up in a Linux Distribution as well.

Generally, the advantage of being able to select all the different parts of your operating system is finer control over what goes into it. Consider this about Linux: If you do not need an Instant Messenger client or a media player, you don't have to have one loaded. If you do not need an internet browser, you are not forced to load one. If all you need is the kernel, I/O access, a file system manager, TCP/IP, and a Web Server, you can get that, and just that, with Linux; a sketch of what that looks like is below.
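As a rough sketch on a Debian-based distro like Mepis (the package name is real Debian, but treat the details as illustrative), a bare web server really is just a couple of commands, and nothing drags in a desktop, a browser, or a media player:

    # a stripped-down box: kernel, filesystem, network, and a web server
    apt-get update
    apt-get install apache2    # installs the web server and its libraries only
    # no X, no KDE, not a single desktop application gets pulled in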

The disadvantage to Linux is that you are given control over what goes into your Linux distro. Consider the average user for a second, the one who does not have a clue what programs they use. Most of us know somebody who, when asked what browser they were using, responded with "Windows." Most of us probably also clarified: that's the operating system, what browser are you using? The average consumer doesn't know what programs they use because they do not care what programs they use. Yes, we can make charts all day long and take screen shots of the tools we use to clean up their computers and tell them why certain programs are bad, but to the average user a computer is a tool. If it doesn't work, replace it with one that does. Now, imagine taking this user and putting them in front of a Linux Distribution for the first time.

To pick on Mepis for a minute (hey, I use it), the system comes with these programs: Kate, Kedit, Kwrite, and Open Office Writer. These are all programs used to write with, some offering more advanced functionality than others. If I add NVU/Kompozer, I now have 5 programs that all let me write stuff.

How many do... I actually need?

On lower end systems, say an AMD K5, Kate is a much better choice than Open Office. It's much lighter and much faster. But if I'm using, say, an AMD Athlon64... Kate is... pointless. Open Office offers much more document functionality. But how do users know what program is right for them? How is the common consumer going to know that on an aging Pentium they are better off skipping Open Office, but on an Athlon they have the performance to run it?
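Not that anybody's grandmother is going to run this, but here's a minimal sketch of the check involved, using /proc/cpuinfo and free, both standard on Linux; the "athlon" test is purely illustrative:

    # what processor and how much memory does this box actually have?
    grep "model name" /proc/cpuinfo
    free -m

    # an illustrative rule of thumb, not a real installer check:
    if grep -qi athlon /proc/cpuinfo ; then
        echo "plenty of horsepower: Open Office will run fine"
    else
        echo "older processor: Kate will feel a lot faster"
    fi

The average user has never heard of /proc/cpuinfo, which is exactly the point.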

The point is that since the average consumer does not know what programs are or are not good for use on their Linux distro, they are dependent on what the Vendor sets as the default. Microsoft sets its own programs as the default for everything, and the user has some assurance that these programs will work somewhat in the way they are intended to work. The same cannot be said in the Linux Distribution world, where the defaults can vary wildly from distribution to distribution. I am not saying that's a bad thing. That is one of the good things about Linux: there is a distro for everybody.

What is important to keep in mind here is the amount of control that is relinquished to the vendor. Again, this is not a bad thing. If you want to piece together your own Linux distribution from the ground up with .deb repositories, or if you want to compile from scratch, that's your prerogative. I'm not going to stop you, nor would I dissuade you. For the average user, or for the person who isn't keen on taking the time to do so, having someone else piece together the distribution is the way to go.

And that is where we start picking away at the key security differences between Microsoft and Linux, and why businesses like McAfee and Norton can exist in the Microsoft world, but don't really have a place in the Linux or Unix markets.

Some Vendors, in order to make their Operating System easier to work with, will make design choices that remove a lot of the visible security features; Linspire is one example. It doesn't mean the security isn't there, it just means you'll have to work to turn it on. On the other extreme you have OpenBSD (not a Linux, but still *nix), which hand-reviews every piece of code that goes in, and sets the tightest possible security standards. Your security is only going to be as tight as the vendor sets it unless you, the person using your operating system, take the time to go in and change the settings.

The fact is that McAfee and Symantec built their security Empires on Microsoft's behavior. Microsoft built their operating system without regard to security. Like Linspire does with their Linux Distribution, Microsoft was more concerned with making their product USEFUL than with making it secure. I am not saying this is, or was, a bad thing or a bad goal, especially in the days before I1 (Internet1) was established. Again, being honest, when Windows 95 came out... would you really want to use a Macintosh or an MS-DOS box? Would you really want to use the... um, CDE? was it? On unix? As a home user, Microsoft was the only realistic choice available.

Not a bad start... and, again, one I can't fault Microsoft for. Like the Xbox did for taking console gaming mass market in the US, Microsoft Windows brought in entire new generations of computer users who would never have dreamed of learning a Unix System.

The security problem really came afterwards, with the advent of Windows "New Technology," or WindowsNT, and the prevalence of networked computers. During the time in which green-screened Bulletin Board Systems (BBS) gave way to the World Wide Web, Microsoft was also working on their new version of Windows, while apparently completely failing to account for the trend towards networking.

One of the main reasons why Symantec and McAfee were able to exist derives from Microsoft's abysmal track record with code security. While Microsoft's Windows and Internet Explorer products are held up as the most vile examples of what proprietary products can degenerate into, the real factor was the attitude behind the code choices.

Microsoft generally coded with the idea of what users could do IF the system was compromised. McAfee, Symantec, and other security vendors established their business on the CLEAN UP of compromised Windows Systems. Consider the virus Michelangelo for a second: Symantec got massive amounts of publicity for a non-issue virus, and made a name based on a free detection (what about removal?) utility. It was not until later in the life of the Security industry that there was a focus on PREVENTION of threats before the system was compromised. The entire industry started out cleaning up known and public threats. What did Microsoft do? Relatively nothing. Microsoft did not change procedure or coding methods in any way to accommodate or account for the virus threats. There was no policy shift towards code responsibility at all. Microsoft's cavalier attitude is one of the main reasons why most security professionals are not expecting Vista to be the security bunker Microsoft is promoting and promising. Vista has already been cracked, multiple times. The kernel protection (PatchGuard) has already been breached. Has anything changed from past versions of Microsoft Windows? Doesn't look like it.

Unix vendors, then Linux vendors, generally held a different view of security. Their view is of what users can do WHEN the system is compromised. Their motives were also drastically different. Microsoft aggressively sought the home market, forming an entirely new market segment for computers. Unix vendors were after big-money targets, selling to institutions like Banks, Stock Exchanges, Point of Sale systems, and Power Stations. To Unix vendors, it wasn't a question of whether someone wanted to break into the system, it was a question of when someone would attempt to break into the system. Because of the different markets being sold to, Unix vendors took security far more seriously than Microsoft ever has.

Consider the /root and /user model. While the user has some access to use the system, the /user cannot access anything above their own directory. That means the /user cannot make any changes to the Operating System itself in a *nix system. Sure, there might be a virus that can enter the system and compromise the running session, but what if that session is the /user session? Guess what? The /root account is unaffected. Cleaning up a virus infestation is as simple as dropping back to /root and deleting the user's folder right out of the system. Recreate, or recover from a backup, and roll on; a sketch of that cleanup is below. Security can even be taken to another level, where the /root and /user accounts sit on two separate partitions, or possibly on two different drives. That's just simple, basic security that has been in place for decades in *nix. Microsoft is only getting to this model with Vista. What took so long?
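Here's a minimal sketch of that cleanup, run as root on a typical Linux box; the user name and backup path are hypothetical:

    # to root, a compromised user account is just files to be removed
    userdel compromiseduser                  # drop the account (hypothetical name)
    rm -rf /home/compromiseduser             # delete the infected home directory

    # recreate the account and restore from a known-good backup
    useradd -m compromiseduser
    tar -xzpf /backup/compromiseduser.tar.gz -C /home/compromiseduser    # hypothetical backup

The operating system underneath never has to be touched.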

Historically speaking, Microsoft focuses on the if, banking on the possibility that the system will not be compromised, and with Vista takes the stance that they can protect the system FROM being compromised. Unix and Linux programmers build on the idea that the system is going to be compromised. It may not be now. It may not be for a long time. The system is built to minimize the impact of what will happen when malicious code is developed.

That single attitude difference is one of the reasons why McAfee and Symantec, as well as other security vendors, probably will not ever have a market in the *nix world. It is not that intrusions will not be coming. It is not that hackers will not try. It is that *nix systems are built with multiple layers of protection, and code isolation, that make deep penetrations near impossible to pull off without a complete collapse of the /root account. Despite hackers having full access to how the kernel is built, how the I/O access works, how the file system works, how everything in the OS works, when it comes to Linux, you have not seen anybody making viral applications like "Iloveyou."

The example I like to make is this: You work as a repossessor for a bank. Your job is to go and either retrieve or disable cars that payments are not being made on. You have two cars to go after.

One of these cars you have never worked with before. The hood is welded shut, the doors are locked, the gas tank is locked, but the underbelly is exposed. You have some tools that may allow you to pick the lock and get into the interior. Someone may have even given you a copy of a key, you just don't know if it's the right key. You might have some skeleton keys on hand as well.

The other car you need to go get has an open hood, the windows are rolled down, the gas cap isn't tight, and there is a jack on the back seat of the car. You are fairly confident that the skeleton keys you have probably can fit the ignition. The catch is that ignition will accept several types of keys, but only one key will get the car moving.

This analogy is a bit over the top, I'll be the first to admit that. The first car is indeed Windows. You don't have any access to the kernel, or the engine. You can get to the storage system (the gas tank), but then again, you might not. You may not have a correct user account (the keys), but there are tools available that can brute-force your way into an account (skeleton keys, bashing in the windows), or you can use other tools to try to interrupt the storage system's conversation with the kernel (keyloggers / network sniffers).

The second car is Linux. Everything is fully documented. There are no secrets about the engine; you can get to that. There are no secrets about the gas tank, the storage system; you can get to that. You can interrupt the transmission, the I/O system; there are no secrets there. You can look at the user accounts; there aren't any secrets about how they work. You know everything about this car and how it works. Everything is opened up to you.

The single catch is that single key that's needed to drive the system. Now, I don't know how many people here do Cryptography, but the subject matter is similar. I just did a search on google for these terms: cryptography still secure even if the process is known. I turned up a link from CSA on the subject, and there were several other entries listed. The purpose of Cryptography is to hide or disguise content by encrypting it. Most of us as kids probably had fun playing spy and writing letters in invisible ink, or using Cereal Box Decoder rings so that the big grown-ups would have no idea about our afternoon plans. Some of us had fun breaking those encryption codes. Bring that forward to today, where security is an issue. A good encryption system is one where, even if the process is fully known and documented, you cannot break the code without the original key. Consider ROT13, or Rotate 13, for example. ROT13 has no key at all, so if you know that somebody used ROT13 on some text, you just use ROT13 again to read it. Try it on the line below; a sketch for decoding it follows.

Lbh qvq gung evtug? Tbbq sbe lbh!
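If you want to check your work, here's a minimal sketch using tr, which is standard on any Linux box; since ROT13 is its own inverse, the same command both encodes and decodes:

    # rotate every letter 13 places; run the output through again to get the original back
    echo 'Lbh qvq gung evtug? Tbbq sbe lbh!' | tr 'A-Za-z' 'N-ZA-Mn-za-m'

That one line is the whole "cipher," which is exactly why knowing the process breaks it.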

The point is that unless you have the root key, you still can't really do anything with the Linux car in the example. Sure, you might be able to turn on the radio with your user account, and you might be able to get the engine to fire, but without the single correct key, the Linux car is going nowhere. That is the difference right there. *nix systems are built with the key in mind first; then the rest of the system is built around it. After the key is in place, then you can work on locking down access to the engine, the transmission, the gas tank, the windows, and everything else. The system, however, is not dependent on the extra parts being there. The security will still function if all that is left is the single /root key.

Now, if you want to draw a parallel from the analogy along the lines that malicious code writers for Windows are idiots, that's your choice. It just seems to me that it would be a lot easier to break into a system that is fully documented and completely open than to go after one where you have to hunt for backdoors in finished code. It would also be a lot more damaging, considering how many major resources run on *nix systems. Taking down 5 million Windows computers is one thing. Dropping a power plant offline? Killing a train switching yard? That's real damage.

The fact is, until Microsoft realizes that the problem needs to be addressed by figuring out the protection steps needed to halt an intrusion WHEN the intrusion occurs, not IF an intrusion occurs, and stops pretending that all intrusions can be prevented FROM happening, McAfee and Symantec will always have job security.

And *nix systems will always be more secure.

Okay, so that wasn't exactly quick, was it?

Yeah, the quick comment turned out to not be so quick. Well, it's about to get longer. Rather than editing the old post, I'm going to leave it up. An extended edition will be published... soonish.

Thursday, October 12, 2006

A quick comment on the McAfee / Symantec / Microsoft kerfuffle.

By now, if you pay attention to any tech site, you've probably seen or heard about the issues that McAfee, Symantec, and other computer security firms have with Microsoft's Vista OS. A general sampling of the story can be found at, say, OS Weekly.

Now, as I see it, the issue is fairly simple. Based on reports, like one here on Anandtech's Dailytech site, Microsoft is working directly with Kaspersky and cutting other security vendors out of the Vista development process. Established firms like McAfee and Symantec are having none of it, and want full access to the Vista development process. You really can't blame them; after all, they do depend on Microsoft Windows to exist. You can easily find several articles covering the tit for tat between the security firms and Microsoft, but that isn't what I want to focus on.

I want to look at why this is a problem to begin with. Microsoft already stepped on a lot of IT toes with Windows Xp by proclaiming it the most secure Windows version ever. Technically, this would have been true if Xp hadn't been built on the WinNT 5 kernel, which carried over 6,000 known issues from Windows 2000 that Microsoft is never going to fix. That isn't the issue I want to look at either.

Let us be honest here: we as consumers take for granted that our Windows Operating Systems are going to be attacked. We take for granted that there are going to be viruses, spyware programs, ad-ware programs, and complete and total jackholes with nothing better to do than write malicious code. We, collectively, as consumers have accepted this. We have, collectively, gotten used to running virus scans, disk cleanup, disk defrag, and chkdsk /f /r (answer Y to schedule the scan, then reboot) on a weekly or daily basis. Collectively as consumers, we rolled our eyes when the most recent Vista builds had their security control schemes cracked, and collectively we went "Why did it take them that long anyways?" Collectively, as consumers, we've come to expect Microsoft to drop the ball completely on user security, and Microsoft delivers consistently.

Microsoft, of course, is fond of comparing their operating system security to Linux, their only real competitor. Now, I could link several stories focusing on Linux vs. Windows security, but let's accept this little fact: Both Operating Systems can be just as Secure as the Other.

Yes, I went there. The fact is, if you take the time to lock down user permissions, lock out ports, get a good hardware firewall, and lock user access, both Linux and Windows can deliver similar security environments. The question that needs to be asked in the security debates is whether you are comparing apples to apples, or oranges to oranges.

The fact is, Microsoft offers a much more complete operating system. They control everything from the TCP/IP stack to the Desktop manager, the file system, the I/O access, and the kernel access.

Linux, however, is just the kernel. While the kernel is the core of the operating system, it needs more programs in order to actually do anything. You can use the command prompt, yes, but if you want to do anything with the Kernel, you are going to need a file system, I/O access, TCP/IP for the network, a window manager, and maybe a desktop environment.

This is where the comparisons of Windows to Linux generally fall apart. Linus T. himself doesn't make or sell a Linux distro. Other companies such as Red Hat, Novell, or Mepis take the Linux Kernel, tools and applications from GNU, and other software, put them together, and that is your finished distribution.

The advantage is that you have finer control over what goes into your Linux distro. If you don't need an IM client or a media player, you don't load one. If you don't need a browser, you don't have to load one. If all you need is the kernel, I/O access, a file system, TCP/IP, and a Web Server, you can get that, and just that, with Linux.

The disadvantage is that you have finer control over what goes into your Linux distro. If you don't have a clue what you need or do not need, and the majority of consumers probably don't, you can easily bog down your system. To pick on Mepis for a minute (hey, I use it), the system comes with these programs: Kate, Kedit, Kwrite, and Open Office Writer. These are all programs used to write with, some offering more advanced functionality than others. If I add NVU/Kompozer, I now have 5 programs that all let me write stuff. How many do... I actually need? On lower end systems, say an AMD K5, Kate is a much better choice than Open Office. It's much lighter and much faster. But if I'm using, say, an AMD Athlon64... Kate is... pointless.

The point is that if you don't know what you are doing with your Linux distro, you are dependent on what the Vendor sets as the default. Some Vendors, in order to make their Operating System easier to work with, will make design choices that remove a lot of the visible security features; Linspire is one example. It doesn't mean the security isn't there, it just means you'll have to work to turn it on.

On the other extreme you have OpenBSD (not a Linux, but still *nix), which hand reviews everything that makes up the code, and sets the tightest possible security standards.

Like Microsoft Windows, then, your chosen Linux distro may not be all that secure. And now we are getting to the point I've been aiming for.

The fact is that McAfee and Symantec built their security Empires on Microsoft's behavior. Microsoft built their operating system without regard to security. Like Linspire does now, Microsoft was more concerned with making their product USEFUL than with making it secure. All in all, this wasn't a bad goal, back in the days before I1 (Internet1) was established. Again, being honest, when Windows 95 came out... would you really want to use a Macintosh or a Dos box? Would you really want to use the... um, CDE? was it? On unix? As a home user, Microsoft really did provide the only choice.

Not a bad start... and I for one can't fault Microsoft for doing that. Like the Xbox did for taking console gaming mass market, Windows brought in entire new generations of computer users who would never have dreamed of learning a Unix System.

The problem really came afterwards with the advent of Windows "New Technology," or WindowsNT. Or rather, with the attitude behind it. And that is the point of this:

One of the main reasons why Symantec and McAfee were able to exist derives from Microsoft's abysmal track record with code security. While Microsoft's Windows and Internet Explorer products are held up as the most vile examples of what proprietary products can degenerate into, the real factor was the attitude behind the code choices.

Microsoft generally coded with the idea of what users could do IF the system was compromised. McAfee, Symantec, and other security vendors established their business on the CLEAN UP of compromised Windows Systems. It was not until later in the life of the Security industry that there was a focus on PREVENTION of threats before the system was compromised. Microsoft's cavalier attitude is one of the main reasons why most security professionals are not expecting Vista to be the security bunker Microsoft is promoting. Vista has already been cracked, multiple times. Nothing has really changed.

Unix vendors, then Linux vendors, however, generally have a different view of security. Their view is of what users can do WHEN the system is compromised.

Read that again. Microsoft focuses on the if, banking on the possibility that the system will not be compromised. Unix and Linux programmers build on the idea that the system is going to be compromised. It may not be now. It may not be for a long time. The system is built to minimize the impact of what will happen when malicious code is developed.

That single attitude difference is one of the reasons why McAfee and Symantec, as well as other security vendors, probably won't ever have a market in the *nix world. It's not that intrusions won't be coming. It's not that hackers won't try. It's that *nix systems are built with multiple layers of protection, and code isolation, that make deep penetrations nearly impossible to pull off. Despite hackers having full access to how the kernel is built, how the I/O access works, how the file system works, how everything in the OS works, when it comes to Linux, you don't see anybody making viral applications like "Iloveyou."

The example I like to make is this: You work as a repossessor for a bank. Your job is to go and either retrieve or disable cars that payments are not being made on. You have two cars to go after.

One of these cars you have never worked with before. The hood is welded shut, the doors are locked, the gas tank is locked, but the underbelly is exposed.

The other car you need to go get has an open hood, the windows are rolled down, the gas cap isn't tight, and there is a jack on the back seat of the car.

Get the picture? Yeah, the first car is your Windows OS, and the second car is Linux. If you are competent at your job, you are going to take the second car first. And yes, if you want to draw the parallel that programmers for Windows are completely clueless morons, feel free to do so. When your door gets busted in by Ballmer with his chair, I was never here.

Silly analogies aside, the fact is that until Microsoft realizes the problem needs to be addressed by figuring out the protection steps needed to halt an intrusion WHEN the intrusion occurs, not IF an intrusion occurs, McAfee and Symantec will always have job security.