Wednesday, October 31, 2007

looking for specific hardware

Last week I put the following up on the Guide Site. I think enough time has passed to run it here as well.
This is also the first time that I am... well, directly requesting two particular hardware donations. While I have access to Core2 systems, because of my recommendations to the buyers those Core2 systems do not use Intel chipsets. Combined with the slow death of the Sager Tablet that was used to create the Mepis 6.5 Intel dpkg run-through, I am without an Intel-chipset system of my own to work with.

So, I am actively seeking donations to cover purchasing an Intel chipset based motherboard, or a donation of a recent Intel Chipset motherboard.

By the same token, referring to a page on the blog, Nvidia has aggressively attacked their driver issues with Linux support. After selling the (in)famous Clevo D900T laptop, I went with a smaller but cheaper Asus F3K laptop with a Radeon HD GPU. While this allows me to cover Radeon HD support as it enters Mepis... I am without a GeForce 8x00 series card to test with.

So, I am actively seeking donations to cover purchasing a GeForce 8x00 series card, or a donation of a GeForce 8x00 series card directly.

The contact information and emails are listed on the Guide Site (all the way inconveniently at the bottom I know). Here, I guess I feel a little bit more brazen about specifying what I'm looking for.

Specifically, on the Intel side, I'm looking for integrated chipsets with the Intel X3000 class video adapter. I'd actually want one of the G35 chipsets with the Shader Model 4.0 X3500 IGPU, but since I was unable to find any for sale (anywhere aside from a Toshiba laptop), that leaves things like the ECS G33T-M2 LGA 775 Intel G33, or the slightly more expensive Intel BOXDG33TLM LGA 775 Intel G33. With the G33 chipset and X3100 IGPU, they claim to offer Shader Model 4.0 support... but strangely enough... not OpenGL 2.0 support.

And while I'm being very cheeky in putting this up here... I don't have the guts to ask Intel directly for a G35 test motherboard... not with everything I've said against Intel just on these pages alone.

From the Nvidia side, what I'd like to get is something like the XFX PVT80GGHF4 GeForce 8800GTS 320MB 320-bit card. But at a nearly $300 price tag, I'm not holding my breath. The slightly more reasonable EVGA 512-P2-N773-AR GeForce 8600GTS 512MB 128-bit GDDR3 would also be good... only at $200, I'm not holding my breath there either.

Truth be told, all I really need for driver testing is something like Leadtek's PX8600GT 256MB Standard GeForce 8600GT 256MB 128-bit GDDR3. It's only $100 and I don't feel quite as bad wishing for one.

So, why the change from downplaying both Intel and Nvidia products to requesting engineering samples, donations to cover purchasing the hardware, or asking for the hardware directly?

The main reason is, I am supposed to be an objective journalist. Sometimes though, BEING objective requires doing things that go against my own personal wishes. Nvidia fed me a big fat shut-up sandwich when memory handling problems in the GeForce4 drivers were fixed in a recent legacy driver update. Nvidia's visible work on Linux drivers has drastically improved, with multiple monthly releases addressing issues in a timely manner. Now, while I personally already have samples of every previous Nvidia hardware class aside from the FX series, I don't have any 8x00 hardware. That leaves me in a poor position to be accurate in discussing the real-life usage state of Nvidia's current hardware lineup.

By the same token, I rail against Intel just about every chance I get. I haven't tried to hide or disguise my opinions of Intel. However, while I have access to Core2 systems... I don't actually have any Core2 systems or modern Intel IGPUs on hand to test or work with. That leaves me in a poor position there as well.

Thing is, if I'm not willing to turn around and review what I say, or accept the possibility that what I say could be wrong... I'm not much of a journalist.


Somebody also asked in an email recently just how thorough my benchmark testing is. Well, in one of these let-me-show-you moments, I tossed one of my older benchmark sessions online some time ago. It can be found here:

Saturday, October 27, 2007

Boiling Water : Windows

One of the basic problems facing Linux conversions these days is the person who says something to the effect of "I've used Windows XP for the past 4 years and I've never had any problem with it. I've never had any blue screens, I've never had any crashes or lock-ups, and I've never had anything go wrong. What are you Linux guys doing with Windows to say my system is unstable?" The truth is, from an average user's point of view, NT5 was a really good release from Microsoft.

How then does one approach the users who honestly believe there is nothing wrong with using Microsoft Windows? How does one approach somebody who has never actually had a major problem with Microsoft Windows? How does one convince such a user that Microsoft Windows is a bad operating system to be using?

There isn't, as far as I see, a simple answer. One of the starting points is to ask what anti-virus and anti-spyware software the Windows user runs. The leading point then is to point out that with all existing known Linux distributions you do not need to run an anti-virus or anti-spyware program. There are some available if you want to run them, but they are not requirements for day-to-day use.

Of course, there always will be somebody who states they DO NOT run an anti-virus or an anti-spyware program and that they STILL have never had any issues or problems with Microsoft Windows. Problem is, you can't really come out and call such people liars, even though the empirical evidence is heavily stacked in favor of anybody who states such lying through their teeth.

So, the next step is to ask what other programs they use on their computer to connect to the internet. You'll find that many respond that they use Firefox, Opera, Thunderbird, or some other browser and email combination that does not include Microsoft Internet Explorer, Microsoft Outlook, and Microsoft Outlook Express. Considering that those are the main entry points for issues into the Microsoft Windows operating systems, it is almost a simple matter to point out that the day-to-day programs in use happen to be the same ones available in Linux. It is also helpful to point out that open-source applications are generally built to a higher security standard than Microsoft applications.

If users respond saying they do use Internet Explorer and an Outlook product, at that point? Yes, feel free to laugh and call them liars.

The question is though, how did it get to be this way? How did Microsoft come off of the reviled Windows 9X releases, go through a messy launch cycle with XP, to several years later having users state they never had any issues with their XP system?

The answer is found by using a Frog, some water, a pot, and a stove. Put some water in the pot and bring the pot to a boil. Drop the frog in, and the frog will more than likely immediately jump out. Okay, drain the pot and let it cool, then re-fill it with water. Put the frog in the pot, then turn the stove on. As the water comes to a boil the frog will likely not remove itself, and generally die in the boiling water.

Familiar with the experiment? Okay, now replace the pot of water with Microsoft Windows, and the frog with the user.

Take a user who has never used Windows before, and they'll go crazy with how bad everything seems to operate. You'll also get a long list of things that seem broken or just plain nonsensical.

To a user that has only used Microsoft Windows? It is just like being the frog placed in the pot of water before the water is heated up. Users get desensitized to the need for security software, and the need to leave automatic updates enabled. Users get familiar with the de-fragmentation process, as well as the normal maintenance of a Windows computer. To many users, it isn't such a big deal that cleaning out the computer is a daily task, or that the first step when getting a new computer is to go get a better browser, a better email client, a better music player, and so on and so forth.

What makes the distance even greater is that the average user has never actually installed Windows XP. The average user sends the computer off to a specialist (me) in order to get any apparent problems addressed. So, the end user never winds up sitting on the phone for 2 or 3 hours to get their copy of Windows reactivated. Many OEMs actually ship their versions of Microsoft Windows without the activation software, so if the user has to repair or recover the OS, they never run into the problems retail buyers run into.

At some point users just decide they can deal with the little things, and that's why changing such users to Linux is such a challenge.

Linux is a pot of boiling water, period. Coming from whatever system users have had before, most Linux distributions are completely different. There is a learning curve for the new user interface and the new program locations, and the structure is completely different. After a while Linux begins to simmer down as users get used to the heat. After a while? Well, anybody can get comfortable in a sauna given enough time.

Conversion problems are further complicated by the average user's perception of the pro-Linux communities. Many average users see the rants about Microsoft's actions on various pro-Linux sites and genuinely wonder what the big deal is. Why is everyone so upset at Microsoft? What has Microsoft ever done to them?

Part of this wondering is derived from Microsoft's political friends. We've already been down that path before. Politically Microsoft is aligned with ABC, CBS, NBC, Reuters, and the Associated Press. These news outlets have no desire, nor any inclination, to accurately report the issues with Microsoft.

One of the basic facts of Microsoft is thus continually overlooked. Microsoft is a federally convicted criminal. Microsoft has been tried in a court of law and found guilty of breaking numerous trade laws, as well as criminal laws.

Microsoft's links to the RIAA, the MPAA, and other freedom-removing organizations are also generally overlooked. A lot of users have read about the Palladium and Janus technologies. A lot of consumers were adamantly against any such restrictions being placed on their purchased hardware. However, after a few name changes and some creative marketing, very few consumers realized that Janus technologies formed the basis of the Zune sharing system. Even in many of the tech site reviews I saw, very few put two and two together and figured out that Zune was using the failed Janus technologies.

A lot of users don't realize that Microsoft has a hand in the quoted Trusted Computing Platform, which explicitly removes the user's control over the hardware and leaves that control in the hands of the vendor.

A lot of users don't realize that Microsoft has a hand in creating restriction technologies that allow vendors to retain control over audio and video content and remove the user's control over that content.

A lot of users simply don't see two and two being put together, and don't understand that Microsoft is a convicted criminal that is still engaged in multiple criminal activities. Such users have been in the boiling water of Microsoft for so long, it makes no difference.

How to break that barrier? How to get such users to realize that what Microsoft is doing is wrong and needs to be avoided at all costs?

I don't have the answer. If I did, I'd probably also have the answer as to why several voters honestly think that Hillary Clinton's Socialist platform is a good thing, and don't realize that such a platform already tanked, BADLY, in the (former)USSR.

Yes. I did just equate Hillary Clinton to a post-WWII, start-of-the-Cold-War-era Soviet leader. And anybody else who reads her platform will come to the exact same realization. Now, she's a buddy of Microsoft.

Do I really need to actually continue this line of thought?

Friday, October 26, 2007

Why Windows selling better than Linux does not bother me.

Recently there has been an article floating around touting that Windows is selling better than Linux. Now, there are a multitude of ways to discredit an article these days. Do as I did with Walt Mossberg and go after what he doesn't write about: note that crucial facts or history are missed or distorted, and the credibility of the writing is removed. Another way is to go after the funding of the article or report, such as the numerous reports funded by Microsoft with pre-determined results that Microsoft Windows is more secure, faster, or whatever, than a competing operating system.

With Windows and Linux sales though... any story reporting sales between the two starts out on a false assumption. That assumption is that Linux and Windows are distributed through the sale of a physical product or a license. The problem with the assumption is that many Linux distributions are explicitly licensed so that they cannot be locked into being purchased on a physical medium. The result is that most Linux distributions are distributed through free-to-access FTP servers, or simply copied and handed out to other users. In the cases where a consumer Linux has an attached serial number or purchased license, most users will avoid the product. This is something Mepis learned first hand during its experience with TechAlign and Frontier Linux.

The fact is, the majority of Linux's growth as an operating system has come at the hands of grassroots activists. The majority of Linux's growth continues as a grassroots activity.

This is not to say that the work of IBM, Novell, Red Hat, Dell, HP, Lenovo, AMD, and other hardware and software vendors is not having an impact on the growth of Linux. Due to the work of many independent hardware and software vendors, Linux is found everywhere from the consumer desktop to dedicated devices like TiVo and the PlayStation.

The problem is, comparing sales of Windows to Linux completely ignores the grassroots factor.

Now, some arguments state that the grassroots factor isn't very large, and that most of the work of Linux is found in the server market.

I can pretty much destroy that idea with one word. Debian.

Fact is, Debian is one of the largest server operating systems in use, with the latest version of Debian, Etch, easily qualifying as one of the top three Linux based operating systems in use today. Yet, nobody sells Debian. Where do all of the installs come from, then, if Debian can't be sold? Okay, that is a rhetorical question, as users obviously download and install Debian themselves, but that is an activity accountants cannot account for.

Going beyond Debian (pure) there are the huge grassroots factors of Mepis Linux and Ubuntu Linux, both of which fall under the Debian tree. Again, accountants cannot explain the success of either of these two versions of Linux in terms of sales, since there is no direct sales data to be obtained. Mepis and Ubuntu are freely available to download, freely available to copy, and freely available to give to other users.

Debian isn't the end of the story once one starts to include community based editions of commercial distributions such as Fedora and OpenSuse. Neither of those OSes is sold in any retail channels, and I'm fairly confident that the Fedora following alone is probably larger than Ubuntu and Mepis put together.

The end result is that any report which purports that Windows is selling better than Linux ignores a crucial factor of Linux distributions.

The tracking situation isn't likely to change within the near or far future as it is. Many Linux developers are heavily concerned with user freedoms, and simply don't want to be tracked, or be in a position to enable tracking. Many Linux users take a similar position and avoid tracking measures, again referencing the deluge of hate posts generated when TechAlign issued serial numbers with the Mepis-based Frontier Linux.

Does this mean that all Linux tracking data is effectively flawed? In short, yes.

Does this also mean that any sales reports comparing Linux and Windows server sales are also effectively flawed? The gut reaction is again, yes. The reasoned-out answer is, probably not. Most Linux and Windows servers are classed to deliver a particular performance-per-watt or throughput amount. Comparing functionally equivalent Linux and Windows servers from one independent hardware vendor is of use for tracking sales data. Attempting to track the server sales overall? Isn't possible, because of Debian.

In many ways, tracking Linux sales versus Windows sales is a lot like a synthetic benchmark. If you narrow the field enough, then comparisons are valid. But, when you try to look at the entire field? The effect falls apart.

Tuesday, October 23, 2007

on FutureMark's "Vantage" benchmark.

Recently I read through a breakdown of the FutureMark produced Vantage benchmark suite. Now, I don't like FutureMark to begin with. Their benchmarks have had multiple stability and performance issues in the past. Take the most recent 3DMark, 3DMark 2006: the benchmark tested Shader Model 3.0 support under Microsoft Windows operating systems. However, on ATi's Shader Model 3.0 hardware, the 3DMark 2006 test would crash out on looped testing. Thus, stress testing ATi Shader Model 3.0 hardware on 3DMark 2006 using the loop methods required disabling the Shader Model 3.0 tests.

FutureMark's products have generally been found to be less than accurate at their stated task, comparing computer hardware, and benchmarking overall performance. Rather, the benchmarks are generally only useful for comparing identical hardware at different clock rates in order to get a handle on the performance difference. This is because FutureMark's products are generally Synthetic Benchmarks.

HardOCP has gone into the subject of why synthetic benchmarks are bad to begin with, and like me, they pursue benchmarking trying to capture how applications actually perform in real usage conditions. Thing is, in my own tests with real-usage conditions between Intel Core2 and AMD Socket AM2 processors, I can't find that much difference between identically clocked hardware. In a post made in April I pointed out the following:

Something else popped into my head a while back and I figure now is a good as time as any to address it. Someone sent me an email pointing out that Intel and AMD processors were not sold in equivalent clockrates.

This is true: AMD Athlon64 processors have clock rates of 1.9GHz, 2.0GHz, 2.1GHz, 2.2GHz, and so on up.

Intel Conroe processors have clock rates of 1.86GHz, 2.13GHz, 2.4GHz, and 2.66GHz.

Part of this is due to the antiquated front side bus design that the Conroe processors use. The other part is that it prevents direct comparisons of Conroe processors to Athlon64 X2 processors.

Say what? Think about it... aside from the 2.4GHz entry, every other Conroe based processor isn't at an "even" stepping. If you normalize the clock rates in order to compare against an AMD Athlon64, you either have to underclock or overclock one of the processors.

Cute trick isn't it.

In my own personal tests, I've never found a performance situation where a Core2 at 2.66GHz drastically outpowers a 2.6GHz Socket AM2, or approaches a 2.8GHz Socket AM2 on a regular basis.
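The uneven Conroe steppings quoted above fall straight out of the front side bus math, and a few lines of Python make the point. This is my own illustration, not from the April post: the 266.67MHz figure is the quad-pumped 1066MT/s FSB base clock, and the multipliers are the retail ones as I recall them, so treat the exact lineups as assumptions.

```python
import math

# Conroe clocks come from an FSB base clock of 1066MT/s / 4 = 266.67MHz,
# times a whole-number multiplier, truncated to two decimals for marketing.
fsb = 1066.67 / 4  # MHz
conroe = [math.floor(mult * fsb / 10) / 100 for mult in (7, 8, 9, 10)]
print(conroe)  # [1.86, 2.13, 2.4, 2.66] -- the odd-looking retail clocks

# Athlon64 clocks come from a 200MHz reference clock with half-step
# multipliers, which is why AMD's lineup lands on clean 100MHz steps.
athlon = [mult * 200 / 1000 for mult in (9.5, 10, 10.5, 11)]
print(athlon)  # [1.9, 2.0, 2.1, 2.2]
```

The only clock the two lineups share is 2.4GHz (9 x 266.67MHz on the Intel side, 12 x 200MHz on the AMD side), which is exactly the one exception called out above.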

But, when the overview pitted a 2.66GHz Core2 against a 3.0GHz Socket AM2... that's exactly what they got: the 2.66GHz chip outperforming or matching the 3.0GHz Socket AM2 in every test. That's where this part comes in.

I find the results... well, questionable to begin with. In my own experience between Conroe and Socket AM2 processors I've found that they are clock-for-clock equal. That the Core2 consistently posted better scores than the faster-clocked AMD chip? That indicates a severe problem in the benchmark.

A comment made during the article indicated that the Vantage benchmark was specifically programmed to not contradict the results of previous entries in the PCMark suite. Cue something that smells like a rat again. I thought the point of a benchmark was to find out factual performance, not to confirm results that were already known. So, the benchmark suite admits that it is not actually looking for factual performance. It's looking for performance that fits already known results. I've already referenced behavior like this before, back in the Supplemental about Linux Development, when I quoted from Hogan's Heroes.

Continuing on, I then stated the following:
As to why the benchmark is like it is, and why the program is biased in favor of Intel's products, I am not likely to find out on my own. Vista is a Dead on Arrival OS. That Microsoft had to relax its restrictions on XP sales, then abandon the restrictions altogether, indicates exactly how badly Vista is doing. When major OEMs openly refer to Vista as "Windows ME II", and one makes an off-the-record comment that it should have been named "Microsoft Bobista", that should be an indication that OEMs are not moving Vista systems.

So, aside from its inaccurate reporting, PCMark Vantage went after the wrong market. It is not going to be valuable to the reporting press, not in any sense. First, hardware reviewers are going to have to get around Windows Hardware Activation, and after a string of having to call Microsoft 8 times in a row for my own GPU testing on a Vista Ultimate install, Vista was already a scratch for reviews to begin with.

The Windows ME II or Windows ME 2 comments are well known by now. The jokes about Vista following in the path of Microsoft Bob are not quite as well known, probably because an official partner of Microsoft's would probably have their job at risk if their name was attached to the line, even if they were the president or CEO of the vendor.

I have had another editorial on the back burner covering why Microsoft Windows and Microsoft Vista are rapidly driving out the independent hardware review enthusiast. I wish I were joking that I had to call Microsoft 8 times in a row to re-activate Vista Ultimate, but I did. I finally abandoned my Vista testing because it simply wasn't worth my time to sit on the phone to reactivate Vista every time I swapped a graphics card out.

I wasn't finished with the initial comment though.
As far as I see it, Futuremark simply doesn't get it. DirectX is a dead API. Several game developers are having to rapidly rethink development strategy and move to OpenGL rendering to cover all platforms, while at the same time maintaining Shader Model 4 (what Microsoft calls DX10) support. Had Futuremark done their benchmarks in OpenGL, then this benchmark would have been valid for Vista releases, NT5 releases, and could easily be ported to Linux Kernel platforms. Futuremark didn't, so the benchmark is worthless from the start, even ignoring all the other factors.

What makes this worse, from my point of view, is that FutureMark has created OpenGL based tests before, for mobile platforms. But not for desktop gaming, where OpenGL is the only reliable cross-platform graphics API available.

From my point of view, the use of DirectX 10 to provide Shader Model 4.0 support falls in with the already pre-determined benchmark results. FutureMark is not interested in creating reliable cross-platform absolute performance benchmarks. Over the years of the FutureMark benchmarks, such behavior has become more obvious. FutureMark bows to the highest bidder, and makes sure that their benchmarks show what advertisers want the benchmarks to show.

Most of the time when I do these breakdowns, such as with Sony or Intel, I typically advise the company on how to repair the damage done by the product or action. I don't think I can with FutureMark. All I can do is hope that other gamers and PC users figure out that FutureMark is again being dishonest and sneaky with their benchmarks, and that as a benchmark, Vantage is not a viable choice.

HardOCP entries on Synthetic Benchmarks : #1 / #2 / #3

Sunday, October 21, 2007

It just works?

Recently I made the statement that Mepis Linux was the dominant player in the it-just-works market for desktop Linux. An Ubuntu supporter challenged that Mepis was not the dominant player, Ubuntu was. The supporter went on to cite Ubuntu's top place in reviews and online polls, as well as the deal with Dell, as proof that it was in fact the dominant player in the desktop Linux market. If Mepis was truly the better distribution, then it would have been chosen by Dell or have topped the polls, Q.E. Duh.

So, let's examine that theory... where exactly did the perception come from that Mepis was somehow behind Ubuntu? How did Ubuntu get to be so massively popular?

Let's start with Mepis Linux.


Mepis has largely been a one-man operation for about 4 years now. Warren Woodford, after a push by friends, decided to create a version of Linux that operated how he wanted it to operate, and had the applications that were used most on the desktop.

Thing is though, Warren decided to be practical about how his version of Linux would work. Warren decided to use the best tools for the job, regardless of the licensing state of the tools. So, from the first release, Mepis Linux contained both closed and open licensed software. This didn't sit well with Open License groups like the Free Software Foundation. Mepis wasn't on the list of distributions to be supported by those aspiring to only promote Open Licensed Software.

Mepis's use of Closed License software opened up another can of worms early in the distribution's life. Former members of the Mepis Community decided to copy Mepis Linux and resell it for profit on Ebay and through other sites, which went against the licenses of software included in the release. The resellers were collectively busted, and rendered persona non grata in the Mepis Community. Several of these resellers then decided to exact revenge by harassing any author who wrote a review about Mepis Linux, or any user who commented about Mepis Linux in an online forum. The fallout from the F.U.D. attacks meant a clear listing of the applicable licenses at

Despite the list of licenses, even as recently as April, June, and July 2007 the F.U.D. attacks were still being witnessed in locations such as the PCLinuxOS forums, and commented on at MepisLovers: Thread 1, Thread 2, PCLinuxOS thread.

The concentrated F.U.D. attack against Mepis wasn't the only problem Mepis Linux suffered when it came to reviewers. Warren Woodford reportedly went against suggestions made by writers for the popular website, and a select few decided to deliberately avoid writing about Mepis Linux or running news stories on Mepis Linux.

Other writers went out of their way to write anti-Mepis stories, such as a story concerning Mepis Linux having serial numbers (Mepis Pro and Frontier Mepis), conveniently ignoring that professional users buying from Red Hat often had serial numbers to keep track of their user accounts.

Thing is, Mepis Linux has gone out of its way to provide end users with legally licensed products that satisfy both the FSF and the owners of the license. Mepis Linux is one of the few distributions to support the closed-licensed MP3 format off the disc, coming up with a solution that was palatable to both the FSF and Fraunhofer.

Mepis Linux also went to bat against the FSF when it was revealed that distributions using the source code from another distribution were expected to mirror the source code and packages, even if they did not modify the source code or packages.



Ubuntu took a different approach to creating a distribution than Mepis. Ubuntu has an official policy that they will use open-licensed software wherever possible, and only closed-licensed software where there is no other alternative. In practice many Ubuntu users found out the hard way that the reason it works in Mepis is that Mepis used the best tools for the job, not the third or fourth best down the line. In order to achieve the same functionality as Mepis, many Ubuntu users were having to add in, after installing Ubuntu, the same closed-licensed drivers and products that Mepis shipped with by default.

That didn't matter to a lot of the press, which praised Ubuntu for its handling and promotion of open-licensed software, although the reality was that Ubuntu's use of closed-licensed software was nearly identical to the way Mepis used closed-licensed software. The only real difference was where Ubuntu and Mepis drew the line at where it was appropriate to use closed-licensed software.

Ubuntu's rapid rise to fame came not only at the hands of reporters swayed by the sweet sounding nature of Ubuntu's promises, but at the hands of Ubuntu's forum community. Ubuntu forum members encouraged each other to flood webpages and polls about using Linux. If another Linux was reviewed, forum members were encouraged to flood authors with responses about how Ubuntu did something better, faster, and so on.

Ubuntu simply was better at mobilizing its users to respond than other distributions. Ubuntu was also better at figuring out how to exploit various systems. Many of the major Distrowatch exploits created in the past 2 years came from Ubuntu's forum members.

To other distributions, it was unthinkable to exploit how Distrowatch was tracking interest in various distributions. It simply wasn't polite for a moderator to order people to go after an author for writing a less than stellar review.

The final part of Ubuntu's rise to success was its founder, Mark Shuttleworth. Mark is the exact opposite of Warren. Trying to get details out of Warren about what is going on is like pulling teeth out of a non-tranquilized adult polar bear. Getting details out of Mark Shuttleworth is like shooting fish in a barrel... with an Abrams tank.

Thing is, Mark Shuttleworth has two things that other distribution founders generally haven't had. A: Lots of money. B: Lots of charisma.

Mark Shuttleworth has a policy that he will never, ever say anything degrading, mean, rude, or obscene against any other distribution, person, or company. He is a consummate showman who uses his wealth in ways other people haven't. His status as an amateur astronaut was a perfect launching point to promote Linux to people who never would have looked at Linux. There are lots of people who are very interested in the stars and space travel who soak up every bit of news that comes along. Mark Shuttleworth used that to his advantage to promote Ubuntu.

As with the news media's handling of Ubuntu's status with open and closed licensed software, and the status of the forums for having used less-than-ethical methods of promotion, behind the showman is a much darker picture.

Thing is, I side with several Debian developers that Mark Shuttleworth is trying to take over Debian. I've heard from more than one distribution maintainer that Mark Shuttleworth has told them to get aboard the Ubuntu train or there won't be a place for them in his future of Linux. As Bob Crane used to say as Colonel Hogan, I hear lots of things. Like my friends painting a completely different private picture of Mark Shuttleworth than the one I see myself.


Thing is though, Mark did place Ubuntu Linux in a place where a Linux did need to be. If it weren't for the people sitting at phones answering questions, if it weren't for the money behind Ubuntu, things like the Dell deal probably would not have happened.

Mepis, however, has a long history of stepping on toes and being the recipient of attacks, not the initiator of attacks. Personally though, I'm glad Warren stuck to his guns and didn't back down when the F.U.D. attacks hit, when the reporters went away, and when the Free Software Foundation got way out of line. And whether or not Ubuntu likes it, it's still way behind Mepis, and always will be unless it steps up and closes the gaps in its software choices.

Friday, October 19, 2007

What's the deal with Ubuntu's relationship to Debian?

Hi, Saist,

Wondering if you would care to comment a bit on this topic. Someone mentioned that "ubuntu is based on the unstable branch", and on that note, I found a line at Wikipedia saying that "Ubuntu packages have generally been based on packages from Debian's unstable branch..."

My interpretation is that Ubuntu drew on packages from "Sid" but that this is not exactly the same as saying "Ubuntu is based on the unstable branch."

What's your take, if you don't mind and if you have the time?

Both are true and false at the same time.

When Ubuntu says they took a snapshot of a Debian Unstable repository, they took the snapshot of the then current Debian Unstable Source repository regardless of whatever the code name for that repository was.

What Ubuntu does is take a snapshot of a Debian source repository, then use the Debian sources with its own in-house (though open-licensed) tool chain to actually build the programs and the final distribution.

Ubuntu's process of building a new distribution involves modifying the programs and applications in the snapshot. The modifications are achieved by a number of methods. Techniques seen before include updating to newer versions of programs with different external sources; regressing programs to previous versions with previous libraries to achieve stability with applications known to have existing bugs; as well as changing default configuration values for installation and boot detection.

The result is that more recent Ubuntu builds have ceased to be binary compatible with Debian packages, becoming a true fork with changes to the underlying source code.

Currently Ubuntu's latest builds have about as much in common with the original Debian source as Red Hat Linux does. Yes, it is still Linux, it is still open-licensed software, and if you work the dependencies and kernel versions out right, everything should still technically work. I just don't know of anybody who raids the Red Hat repositories on a regular basis to fill out their Debian installs, and I'm pretty sure Fedora developers aren't pulling from Debian sources.


In relation to Mepis Linux, this isn't to say that Mepis does not do the same thing. The development process is very much the same. Part of Mepis's domination of the "it just works" Linux Desktop market is that Warren does follow many of the above processes. Warren does use sources outside of the original repository. Warren does use proprietary kernel modules. Warren does modify the boot configuration and install settings.

The difference is that Warren makes it a point that packages must be binary compatible with the source distribution. A package built for Mepis must be functional in the original source distribution on a binary level.

Ubuntu makes little to no attempt at ensuring binary compatibility with its sources. If anything, from Dapper to Gutsy, Ubuntu has deliberately broken binary compatibility with the original source distribution.

This was a very real issue that Mepis ran into with OpenOffice on Mepis 6.5, as well as with other upgrades. The Ubuntu tool chain simply wouldn't allow for building packages that would be cross-compatible, if the packages would build at all.

Cross binary compatibility is going to be a major factor in the upcoming Mepis Community Repository. The objective is for users of Debian Etch to be able to enable the Community Repository (or mirror it), and pull updated packages from Mepis and have them just work without having to recompile.
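If the Community Repository delivers on that, enabling it from a stock Etch install should be a one-line affair in APT. The following is a hypothetical sketch only; the mirror URL and component names are my placeholders, since the repository had not launched at the time of writing:

```
# /etc/apt/sources.list (hypothetical entry; URL and suite name are placeholders)
deb http://repo.example.org/mepis-community etch main
```

After an apt-get update, packages pulled this way would only install cleanly if the binary compatibility goal described above is actually met.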

Monday, October 15, 2007

Linux Gaming, lets run the numbers.

One of the questions I see asked a lot in commercial gaming circles is why I continuously push developers to release native Linux clients. A Sony Online Entertainment official outright said the market share is not there, so, as a company focused on market share, they were not going to release a Linux client.

So, let's stop for a minute and actually run the numbers we do know about.

Nobody knows the true number of Linux desktop users. Desktop Linux users are people like me who use a Linux desktop to complete their daily routines: writing letters, doing taxes, keeping track of finances, sending mail, and so forth. According to Novell there are at a minimum 30 million Linux desktop users out there. On the face of it, Novell's numbers seem to jibe with the known interest in Fedora, Suse, Mepis, Debian, and other distributions. If anything, Novell's number is probably a low guesstimate.

So, let's play it safe and say that there are 30 million desktop Linux users out there.

Now, can anybody tell me how many units a game needs to sell in order to be profitable?

The quick answer is that the profit number changes per game. A high-end title like Epic Games' Gears of War was doing pretty well to hit 3 million sales in 10 weeks. Final Fantasy XI for the Xbox 360 was doing pretty well when it hit 100,000 users in beta.

The 900 lb gorilla in the MMO market is World of Warcraft, with over 9 million registered accounts. Other MMO series such as City of Heroes, Everquest, Lineage, and Guild Wars are happy with only a few million each. Ultima Online is still going, and it has a user base best described in the tens of thousands.

Most consoles are doing quite well if they pass 5 million systems sold. With average attachment rates of 66-70% of consoles getting a given game, sales of 3-4 million copies of a single game are considered a good number.

Stop and look at those numbers again. 30 million Desktop Linux Users. 3 million game sales.

Suddenly the Linux installed base isn't that insignificant. The installed base is far greater than that of most consoles at the end of their life spans. In the case of Gears of War, that's only a 10% attachment rate.

With the dominant MMO, World of Warcraft, that's still less than 30% of the available user base.

For a non-AAA title, where sales are generally expected to be between 500,000 and 2 million over the course of the game's life? That's a very small percentage.
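The back-of-the-envelope math here is easy to check. A quick sketch using the figures quoted above (all of them estimates, not audited numbers):

```python
# Attachment-rate arithmetic from the estimates above
linux_desktop_users = 30_000_000   # Novell's minimum estimate
gears_of_war_sales = 3_000_000     # roughly 10 weeks of sales
wow_accounts = 9_000_000           # registered accounts

print(gears_of_war_sales / linux_desktop_users)  # 0.1, a 10% attachment rate
print(wow_accounts / linux_desktop_users)        # 0.3, under 30% of the base
print(500_000 / linux_desktop_users)             # under 0.02, a modest title's floor
```

Even the modest-title floor works out to under 2% of the claimed Linux desktop base.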

So, let's put two and two together. Who exactly is running Linux on the desktop?


Think about it. Your average desktop Linux user is probably someone who's known for being tech proficient. Your average hard-core gamer is also known for being tech proficient.

Somebody in the Anandtech / DailyTech comments once pointed out that purchasers of high-end graphics cards ($500+) were probably running both Windows and Linux at the same time. The theory was borne out by reports from both Nvidia and ATi. ATi ran into a public relations nightmare, taking flak for not having operational x1x00 series Linux drivers for over 6 months after the x1x00 series launched. The Radeon HD series also faced severe backlash for its lack of Linux drivers. Many Linux users recommended buying Nvidia cards, despite the broken drivers, because at least drivers were available.

With these factors in mind, why hasn't Linux gaming taken off?

It has. Just... not in a way that can be actively tracked. The fact is, W.I.N.E. technologies are designed so that applications don't know they are being run through a non-Windows environment. If Cedega works properly, then as far as any game is concerned, it is running through the version of Windows specified in the configuration.

NCSoft doesn't know who is running their games on Linux because of this. Blizzard also doesn't know who is running their games on Linux because of this. The same goes for any other game out there run through W.I.N.E. technologies.

The sales result is that a sale is a sale. To a game that runs under W.I.N.E. technologies, if any tracking software is installed, all that tracking software will see is Windows. Now, it might note some abnormalities, such as out-of-date drivers in City of Heroes, but the real operating system isn't reported.

Native Linux clients are in a similar position. Sales of Unreal Tournament 2003 and 2004 included the Linux client on the disc. Unreal Tournament 2007 has also been reported to include the Linux client on the disc, although more recent reports indicate that it may not make it there. In the case of retail sales for the Unreal series, there is little to nothing to tell the publisher which platform the game got installed on. If the game client works properly, the server won't care.

Eve Online also recently put out a Linux client in conjunction with Transgaming. However, if the Linux client works properly, there won't be any difference between it and the Windows client. And because the client can also be downloaded, there isn't any difference between an Eve Online Linux sale and an Eve Online Apple Unix or Microsoft Windows sale.

Even when the game client is released separately from the game itself, tracking game usage is next to impossible. Consider UT99 and the various Quake games from IDsoft. The installer to run the games is generally made available from the main site; in the case of IDsoft the Linux installer can come from their FTP server. Yet, if you go around to just about any reliable hosting service, such as FilePlanet, FileCloud, or the like, you'll find the same files uploaded in those locations as well.

Now, having played all of these on Linux and having written install guides as well, I'm fairly safe in saying that I couldn't tell you who on the network has the Windows install of UT99 or a Quake game, and who has the Linux install. Except when the system crashes. That is normally the Windows install.

The real problem with Linux retail gaming is that accountants don't know how to absorb or collate this data into something usable. With Microsoft Windows and Apple Unix it is a fairly trivial process to collate the end-user data. Windows is something accountants understand; Linux and Linux gaming are something accountants do not.

Problem is, after posting this, I have little doubt that I'm going to get several angry emails from accountants who can no longer claim ignorance about Linux numbers and ratios.

Saturday, October 13, 2007

Government isn't the answer.

I'm not sure where to really start on this one. Essentially the idea has been proposed that in order to fix the technology patent problems in the US that there needs to be a significant change in leadership. While I agree with the concept that leadership in the US needs to be changed, determining who it needs to be changed to is perhaps a difficult task... or is it?

My personal opinion is that I don't want the government to get involved in technology, business or other matters. I want less government control, less government oversight, less government overall. Why?

Well, let's put it in practical terms. Name any industry that a government has entered and improved.

Health Care? From what I read coming out of Europe, that's a living joke. I have choked listening to some of the stories coming from people living in nations with government-sponsored health care. Sure, it looks nice on paper, but I really don't want my taxes going to fix somebody else's leg broken during a game of American football.

Military? Oh, I'm fairly sure the soldiers killed in lightly armored Humvees were glad to have switched over from heavily armored tanks. That went over real well. Not to mention the shortages in body armor and weapons development.

Transportation? I live in Georgia. We still have places that were marked for paved roads over 4 decades ago that haven't gotten paved roads yet. Oh yeah, Government has done a real good job there.

The National Endowment for the Arts? Somebody placing a cross in a bowl of urine and calling it art gets a grant. Do I really need to spell out what's wrong with this picture?

The fact is, anytime a government gets involved in anything, any industry, any practice, any job, the results tend to range from bad to horrible. So, why is that? Well, let's think about who is IN a government modeled on the US Congress.

The point of a representative government is that a large group of people elects one person to represent their beliefs, jobs, industry, and everything else to another person who is also elected by a large group of people. In some aspects it is a Government by Proxy. I elect an individual and I expect that individual to represent me to the best of their abilities. The hoped for goal is that large numbers of people are represented by a small number of manageable people.

It used to be that elected representatives were normal citizens: farmers, blacksmiths, tailors, and so on. People with real 9-5 jobs who went out and spoke for other people with real 9-5 jobs.

Now there is a semi-new breed of representative: the career politician. Somebody whose family had money or influence, who has never held a real 9-5 job, never served in the military defending their country, and yet expects to be able to represent people who have done such things. The very real problem is this: career politicians are the last people who should be representing any populace in any situation.

That is why governments tend to do so poorly in any industry that can be named. A car repairman by trade shouldn't be telling a doctor how to best treat a sick patient. A painter by trade shouldn't be telling the military how to build armored vehicles. Somebody with no real-world job experience shouldn't be instructing or determining how people with real-world job experience should be managed. Yet that is exactly what happens today. Some blathering idiot gets up, makes a speech, and gets a letter signed by 40+ like-minded individuals, all of whom are uniquely unequipped to be in the leadership positions they hold.

What is needed then is a return to the old representative style. Common men and women who force themselves out of bed at the crack of dawn, work their rear ends off for 8 hours, come home, work their rear ends off some more, then plop into bed exhausted... those are the people who need to be leading. People who actually have done something with themselves besides pose for photographs.


Now, I've gone over this before in an earlier blog post asking people to follow the money and trail of thought from RIAA and MPAA to existing political candidates and parties. Going back to that trail of thought, I'd encourage people to individually follow up on who is aggressively abusing the patent system by following the money.

We already know that Acacia is behind IP Innovation's lawsuit against the Linux companies Red Hat and Novell. We also know that 3 key former Microsoft executives are running Acacia. We also know that Microsoft is hand in hand with helping the RIAA and MPAA aggressively rework the definition of fair use and content ownership. We also know that the RIAA and MPAA are mostly made up of liberal leftist Democrats who support Hillary Clinton.

Oh yes. I just spelled it out.

I've said this before, and I'll say it again. Anybody who honestly believes that a Democratic leadership is going to result in any change that is beneficial to Linux, free rights, and fair use is either:
A: Seriously Deluded
B: Brain damaged.
Now, I'm not saying that voting Republican all the way is going to fix the attack on fair use and freedom of rights. This goes back to the earlier point that career politicians are not the answer. If anything, career politicians will make the problem worse regardless of political affiliation. As I see it, given the lack of connection between real-world experience and the political realm, many Republicans are swayed by the cute-sounding arguments made by members of the RIAA and MPAA, as well as by arguments made by Microsoft's extensive lobbying staff.

At worst a Republican leadership in the White House and both Houses of Congress should mean that nothing gets worse. At best, it could be the start of getting real people back into the Chain of Command, people who haven't been handed life on a silver platter.

Friday, October 12, 2007

G.O.T. : Grandmother of Technology?

One of the main problems Linux has with desktop adoption today is a lack of a face, a problem Nintendo itself once had. For years Nintendo was known for its mascots: Mario, Zelda, Link, Samus, Bowser, Ganondorf, and so on. But when pressed to answer who exactly was running Nintendo, not many people could name a real person. A few years ago Nintendo hired Reggie Fils-Aime to be their face, something BrandWeek talked about when it handed Reggie the Grand Marketer of the Year award.

In the same way, Linux needs a human face. Yes, the penguin is cute and well known. Yes, KDE's Konqi is well known... to fans of KDE. In a crowd of people, I highly doubt I could single out Linus T., or Andrew Morton, or Ingo, or any other well-known kernel developer whose emails I see almost daily on Kerneltrap.

Novell, I think, had the right idea. I like their mockeries of Apple's PC vs. Mac ads, especially the first one. The Apple ads are well designed to convey the idea of who uses the computers. For the Mac there is a 20-30 something guy who comes across as a little smarmy, but likeable, and for PC there is a 40-50 something guy who comes across as book smart, but clueless. Novell introduced Linux... as a cute 20-30 something female. Can't say I liked the Gnome jacket she was given, but that's my own bias there.

That, I think, is what Linux needs: a face to go alongside the OS from the big name vendors. It doesn't have to be a cute 20-30 something female, though. It could be anybody, even a grandmother. In some cases it might even be great to have the face be a grandmother. How often has "Mother approved" been used in TV and radio advertisements to sell products? Some sort of Grandmother of Technology who simply appears at the end of a Linux commercial saying "I approve!"

Pipe dream? Ludicrous? Probably.

No duh Gabe.
"I think [PS3 is] a waste of everybody's time," he said. "Investing in the Cell, investing in the SPE gives you no long-term benefits."

"There's nothing there that you're going to apply to anything else. You're not going to gain anything except a hatred of the architecture they've created. I don't think they're going to make money off their box. I don't think it's a good solution," he continued.

-Gabe Newell

On one hand the argument does make a lot of sense. Gabe is of course talking about the Playstation3's hardware. While it is powerful, it is also complex.

What grates on my nerves is that there is another good reason why Gabe and Valve have such problems developing for the Playstation3.

The Playstation3 uses Linux and OpenGL. Valve has deliberately avoided developing for either technology, and only recently posted a position for an engineer to help port games to Linux. There is thus, from my point of view, a decided lack of people within Valve who are equipped to work with the Playstation3 to begin with.

To try to put this in perspective, this is like going to a motorcycle repairman for a problem with your car. Yes, both vehicles have wheels, and both have engines. Typically cars and motorcycles both have seats and both have sets of gauges to tell about the internal performance of the engine.

They are, however, two completely different machines with very little in common.

The same holds true between the Playstation3 and Windows XP and Vista, and the WinNT5-kernel-based Xbox 360. It takes a different skill set to work with each product, and from my viewpoint, Valve's hiring history means they don't have very many people with the skill set to work on the Playstation3.

So it's a no-duh argument from Gabe Newell. Sure, the Playstation3 is complex, but there would be a lot more wiggle room for complaints if Valve games and Steam were already out on Linux. With no Linux client? Gabe really doesn't have much room to be speaking on the Playstation3.

Saturday, October 06, 2007

Why DirectX is a multi-million dollar mistake

This is a follow-up to a recent post where I copied in contents from a post on the Cedega forums. While I've covered this topic before with a post about an ExtremeTech interview, I think I need to give this topic its own post.

Okay, the idea is that Microsoft's DirectX is not a viable choice for developers, as using DirectX will cost publishers millions of dollars to code around or with. First, we need to set some history for what DirectX is and what DirectX does.

DirectX, as I understand it, is the umbrella name for Microsoft's interconnection technologies. Under the Disk Operating System (DOS), items like printers, sound cards, and graphics cards had to be explicitly coded into programs. So if you went out and bought a Voodoo Graphics card... you could only use applications that specifically had Voodoo support coded in. If you bought an HP printer, you could only use the printer with applications that supported that printer directly.

Back in the DOS days users had to carefully match software and hardware.

To their credit, Microsoft was one of the first vendors, if not the first, to try to break the hardware-software matchup. The Direct API was one of the first attempts at creating an underlying bridge between the hardware and the software. Hardware such as printers and graphics cards talked to the Direct protocols, which in turn talked to the OS. Software talked to the Direct protocols, which in turn talked to the hardware.

Thus users were freed from having to match specific hardware with specific software. That HP printer not working? Replace it with a Canon.

As time went on, the DirectX graphics portion took on a greater emphasis. Current updates to DirectX generally emphasize graphics support, although running the command dxdiag shows that the Direct protocols have a hand in networking, sound, and I/O operations.

The only competitor DirectX has in graphics is the OpenGL standard. OpenGL was originally created by SGI, if memory serves, and is the default rendering API used in major movie studios. Remember the computer-generated special effects in Star Wars? That's OpenGL. Seen a Pixar movie? That's OpenGL. Seen a Dreamworks Animation movie? That's OpenGL.

Played DoomIII? That's OpenGL. Played Quake4? That's OpenGL. Played a Playstation3 game? That's OpenGL. Played a Wii game? Chances are it was built with OpenGL shaders. Played Metroid Prime? Yes, that is reported to be OpenGL as well.

So, DirectX is a competent product... and OpenGL is a competent product... why is DirectX development costing developers and publishers millions of dollars... but OpenGL is not?

Answering that question involves having to take a look at the entire graphics market, and forgetting commodity x86 and x86-64 hardware.

Fact is, DirectX graphics only works on Microsoft platforms, which are a decided minority outside of x86 and x86-64 commodity hardware. Microsoft has less than 2% of the cellphone/smartphone market. Microsoft-based PDAs have similarly low market penetration. DirectX only works on 2 game consoles.

DirectX 10, which offers Shader Model 4.0 support, is only available on the Microsoft Vista release.

So, if developers or publishers build with DirectX they lock themselves into a very limited number of platforms. This generally is not a problem for specific game developers like Bungie who have little, or no, intention of taking their products to other platforms.

But for major publishers like EA, Activision, and Midway? DirectX is a costly maneuver.

Thing is, OpenGL shaders can be run across all existing major platforms, from cellphones and PDAs up to massively powerful home consoles like the Playstation3 and Xbox 360. As I read Khronos's existing documentation on the OpenGL API, it seems to be a fairly trivial task to convert from Shader Model 2.0 to 3.0 to 4.0.

At first glance this does seem like a foolish argument. Unreal Tournament III is not going to run on cellphone/PDA hardware; it needs something with some serious horsepower. The same thing used to be said about Doom... yet I'm seeing Doom pop up on cellphones now. IDSoftware is also examining turning Quake 3 Arena into a browser-accessed game.

So... lets pose this question. In 8 or 9 years... what's to say that UT3 will be as UT'99 is today? Can you imagine a version of UT3 ported to a successor of the DS? Without having to re-write the graphics code?

The cost of using DirectX over OpenGL is highlighted even more by the release of Microsoft Vista. Consumers who bought graphics cards with Shader Model 4 support have been continuously told that Shader Model 4 visuals are only available on Microsoft Vista.

Untrue. Software written to Shader Model 4 using OpenGL... will have the same visual presentation on any system that supports OpenGL.

The result is that a developer and publisher only have to write code once, to the OpenGL API, and the OpenGL Compiler written into the graphics driver will take care of the rest.

The added advantage of using OpenGL as the API over DirectX is that publishers are freed from Microsoft-only platform releases. Hiring developers to port to other platforms like Playstation3 Linux, Wii, x86 / x86-64 Linux, and Apple OSX is no longer a necessity. One code base, one release.

That's where the millions of dollars in savings come in. One of the major reasons why x86 / x86-64 Linux gaming hasn't taken off is the cost of porting graphics code over. The actual game engine is generally ported over, if not written natively, since Linux offers a much better server platform.

But... by using OpenGL... publishers and developers can remove the headaches of supporting Microsoft Vista and Windows XP while at the same time opening up a user base of over 30 million desktop users.

So... as I said, DirectX is a multi-million dollar mistake developers and publishers cannot be making.


Now... I will say this. I would not mind in the least Microsoft punching me in the nose by releasing DirectX 9 under the GPLv2... which would easily fix the whole multi-platform issue.

Friday, October 05, 2007

Figuring out the date of CE.ME.NT

Someone recently posted a link in a forum that led to a site making a Windows CE.ME.NT. joke about Microsoft Vista. Well, just how long have we been seeing Windows CE.ME.NT. jokes?

Well, Windows NT 4.0 was released on July 29, 1996, so that's our starting point for the NT jokes.

Windows ME had already been named as early as 1997 under the tag Windows Millennium Edition. Originally Windows ME was supposed to be a consumer-oriented (read: pretty) version of Windows NT5. In 1999 Microsoft realized that their re-spin of Windows 2000 Professional was not going to make it out the door in 2000. So another revision of the Windows 9x branch, with the new media components from the existing Windows ME project, was planned. The new project was given the name Windows ME since it could be out on the market by 2000.

As a side FYI, the original Windows ME project made it out the door in 2001 as Windows XP. Windows XP Service Pack 1 and Windows 2000 Professional Service Pack 4 are essentially the same operating system, having been synced up for security and development purposes. Windows 2000 Professional Service Pack 5 was supposed to keep Win2k Pro in line with Windows XP SP2. However, Microsoft canceled Service Pack 5 to try to drive sales of Windows XP licenses. Didn't work, from my viewpoint.

Anyways, so we have a date of 1997 set for when ME and NT were in place... so how about CE?

I can't find an accurate launch date for Windows CE 1.0, but I can give the launch date of the Sega Dreamcast: November 27, 1998, in Japan. One of the high points of the Dreamcast for developers was the availability of a Windows CE development kit alongside Sega's own development kit. Since Sega had been working on the console since at least 1996, we can presume that Windows CE was around in 1996 and 1997.

So, both Windows CE and NT were probably around before 1997. The Millennium Edition codename, which appeared in 1997, means the first Windows CE.ME.NT. jokes could have appeared that year.
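The whole dating argument reduces to taking the latest of the three naming dates. A toy sketch, using the years estimated above:

```python
# Year each name was publicly in place, per the estimates above
name_years = {"CE": 1996, "ME": 1997, "NT": 1996}

# The CE.ME.NT. joke only works once all three names coexist
earliest_joke_year = max(name_years.values())
print(earliest_joke_year)  # 1997
```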

And yes. I spent way too much time thinking about this.

Congrats Newegg. You've managed to give me a bad experience

Just a small sample picture of the first legitimately bad experience I've had from Newegg.

Microsoft and Bungie Split is a BAD thing.

As it is now official that Microsoft and Bungie are splitting ways I'm starting to see editorials and analysis popping up trying to spin the split as a good thing.

No. It isn't.

The split indicates exactly how bad Halo has been in terms of long-term sales for Microsoft. Yes, Halo tends to turn huge sales over its first few weeks, but what about long-term sales? What about sales on platforms besides the Xbox?

Thing is, Halo has flopped everywhere but its initial launch on its initial platform. Halo 1 for Windows was a critical and retail flop, and the restriction of Halo 2 to Microsoft Vista caused its release to flop in retail sales as well.

On Nintendo platforms it is not uncommon to see titles like Mario, Metroid, and Zelda still selling in top 20, or even top 10, lists 6 months, 8 months, even over a calendar year past release. And that's before considering the genuine phenomenon that is Pocket Monsters, or Pokemon.

Sony isn't a stranger to long-term sales either. Consumer demand for Ratchet has sent Insomniac back to the drawing board for another Ratchet title, and the recent departure of a high-level exec from Naughty Dog to Ready at Dawn re-kindled questions about the status of another Jak and Daxter title.

So, No. Halo is not a phenomenon. Halo is a hyped product, end of story. It is an average title. There is no Halo Effect. There is a Hype Effect.

Now, if Halo had been the genuine phenomenon that Microsoft has paid press to portray the product as, there isn't any way a split would occur. Microsoft could easily make it worth Bungie's while to remain on the direct payroll.

As I see it, Microsoft is pulling the same stunt with Bungie that Nintendo pulled. Remember when Nintendo dumped Rareware? How many titles had been profitable from Rareware on the Gamecube? How many titles had been delivered on time?

Answer? Only one title was delivered by Rareware, and it came late.

Nintendo dumped Rareware like a bad habit, but also left open the licenses so that Rareware could continue producing content for Nintendo systems.

Microsoft, in the same way, is pushing Bungie out the door. Halo isn't a long-term viable franchise for Microsoft, and Microsoft knows it. Microsoft, therefore, is distancing itself from Bungie... but in the same way Nintendo distanced itself from Rareware. Sure, if a good product somehow does come out of Bungie, Microsoft doesn't want to be completely out of the loop...

But by losing direct day-to-day control of Bungie, Microsoft has given the company enough room to hang itself.

Speaking for myself, I await the average gamer's revulsion when Bungie starts releasing products on their own.

Nintendo's Wiimote Jacket : Playing Nice?

Recently Nintendo extended an offer to ship plastic jackets for their Wiimotes to all Wii owners, free of charge. Given that the Wii is fast approaching its first birthday, why would Nintendo risk a possible cost of $17.4 million or greater to send the jackets out?

Is Nintendo going after the brisk market for Wii controller gloves that BD & A currently sells? Not likely; BD & A's gloves come with a $10 price tag. The official Wiimote jackets are also a rather ugly clear plastic, or at least the pictures of them are.

Right off hand, the move is a great public relations move. Nintendo has come under fire repeatedly over the past several months from oft-quoted consumer interest groups worrying about the safety of the Wii's motion-based hardware. There have been quite a few comic artists and news agencies that have poked fun at the possibility of putting a Wiimote through a TV screen. As best as I can tell, none of those comic artists or news agencies have ever laid hands on a video game console before. There are hundreds of games out there that will make players want to put a controller through a TV, and many that have.

So, the Wiimote Jacket is a token offering to show that Nintendo is concerned with user safety. It won't shut the consumer interest groups up, but it is my opinion that anything short of duct tape won't shut some of the consumer interest groups up.

The Wiimote jacket is also a good move for Nintendo's finances. As is well known, everything Nintendo does right now is calculated to turn a profit. From the first Wii unit shipped, Nintendo was earning a profit on the hardware.

The effective cost of the plastic jackets probably fits within the difference between the Wii's actual hardware cost, plus taxes and import costs, and its general selling price (Australia need not apply).

Thus, while the Wiimote Jacket offer has the potential to put a dent in Nintendo's finances, even accounting for the past 10 to 11 months of sales, the calculated profit on each console should be enough to cover jacket production and shipping, never mind the profits from the Nintendo DS and from games.
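The finances here are simple enough to sketch. Assuming (purely for illustration) roughly 13 million Wiis sold worldwide by this point, the quoted $17.4 million exposure works out to barely more than a dollar per console:

```python
# Back-of-envelope: what the quoted $17.4 million worst-case exposure
# implies per console. units_sold is an assumed, approximate figure for
# worldwide Wii sales as of late 2007, used only for illustration.
units_sold = 13_000_000          # assumed, approximate
quoted_exposure = 17_400_000     # dollars, figure quoted above

implied_cost_per_console = quoted_exposure / units_sold
print(f"Implied jacket cost per console: ${implied_cost_per_console:.2f}")
```

Against the per-console profit described above, a one-time cost on that order is easily absorbed.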

So, the move is good PR for Nintendo, it's a smart financial move, and it's not really going after the third-party market for controller gloves. With the only pictures of the jacket showing clear plastic versions, colorized gloves are still attractive in the eyes of the consumer looking to tell their Wiimotes apart.

Wednesday, October 03, 2007

No, Intel Does NOT Own Physics.

One of the major events in gaming recently was Intel buying out Havok. Havok is a company that specializes in computational physics, and its engine is widely used in commercial retail games. Havok generally charges thousands of dollars for its physics software, and it is a competent solution. Havok's only direct competitor is AGEIA, which is a hardware manufacturer: AGEIA's physics software is gratis, and they make their money selling dedicated physics hardware.

Where this gets interesting is Physics on GPU. Both Nvidia and AMD/ATi had projects underway using HavokFX, a version of the Havok physics engine accelerated by GPU computations. Several doom and gloom analysts are worried that Intel's purchase of Havok will make it difficult for Nvidia and AMD/ATi to pursue Physics on GPU, and that Intel has already locked in the market for their upcoming add-in graphics cards. If you want Physics on GPU, you'll have to go with Intel. If you want the best Havok physics... you'll have to buy Intel.

Well, I've already been over Intel's chances of making a decent graphics card before. Graphics have been ancillary to Intel for years, and I don't think Intel has the development staff or engineers to compete with AMD/ATi and Nvidia.

That, however, isn't the gaping hole in the idea that Intel owns physics.

The gaping hole is this: AMD is opening up the specifications for how their graphics cards actually work. Now, if you know how a graphics card works, and you have access to the BIOS/firmware, which in AMD's case developers do, what's to stop you from using the graphics card for things other than graphics?

As I see it, Intel's purchase of Havok was in direct retaliation for AMD opening up the specifications and BIOS for AMD graphics cards. Intel sees the scenario I am about to lay out and did what they could to upset it.

With the specifications for the graphics card in hand, what is to stop AGEIA from optimizing their Physics Software Developer Kit to leverage AMD/ATi hardware? Presuming that Nvidia wakes up and opens their specifications and BIOS as well... what's to stop AGEIA from leveraging Nvidia hardware?
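The reason physics is the obvious candidate for this kind of re-purposing is the shape of the work: a rigid-body step is the same small calculation repeated independently across thousands of bodies, which is exactly what GPU hardware is built to chew through. A minimal sketch of such a step (plain Python, purely illustrative; a GPU implementation would run each body's update in parallel):

```python
# One explicit-Euler integration step over many independent bodies.
# Each body's update reads and writes only its own state, so the loop
# is embarrassingly parallel -- the property GPU physics exploits.
def step(positions, velocities, gravity=-9.81, dt=1.0 / 60.0):
    new_positions, new_velocities = [], []
    for (x, y), (vx, vy) in zip(positions, velocities):
        vy += gravity * dt                          # gravity acts on velocity
        new_positions.append((x + vx * dt, y + vy * dt))
        new_velocities.append((vx, vy))
    return new_positions, new_velocities

# A single body drifting right while starting to fall.
positions, velocities = step([(0.0, 10.0)], [(1.0, 0.0)])
```

None of the updates depend on each other, so nothing in principle ties this work to a CPU, or to any one vendor's hardware.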

The following is something that I sent ATi last year, in 2006, before the merger between AMD and ATi was announced. Now, after the merger and after the opening up of the specifications and the BIOS, these questions take on a whole new light.

I've been reading through the reports from Computex about ATi and the Asymmetric Physics Processing, and I was wondering... In the HyperTransport Specification 3.0, part of the specification dictates an external cable link which allows boxes outside the computer to connect directly into the internal HyperTransport links. Also, AMD recently announced Torrenza, an internal connection technology that gives internal devices direct access to HyperTransport links on the motherboard itself.

Some of the rumblings I've seen from Torrenza specifically involved AGEIA using the Torrenza platform to connect their physics processor to the system. Would it be possible to utilize a re-purposed R5xx GPU in the same manner and connect the GPU directly into HyperTransport for direct access to the system? If that is possible, would ATi also consider re-purposing GPUs and selling them in external boxes using HyperTransport link cables? While I realize that external HyperTransport interfaces are not yet in production, I must also note that motherboards with 3 PCIe x16 slots are not very widespread either, so both types of connections have time to grow.

The argument I would make to manufacturers in reference to including an external HyperTransport cable link is that most users are not comfortable with opening their computer up and adding components. While I have little doubt that Asymmetric Physics Processing with CrossFire would sell very well in the high-end gaming segment, that's only, what? 5%? Maybe less? of the available market that is actually sold to. I would think that these users, if given the opportunity to buy an external box that plugs into the computer and immediately creates a performance difference, would probably buy the product. Re-purposing older x1600 GPUs, as mentioned in the tech brief, into external HyperTransport boxes might be a revenue stream that I don't think any manufacturer or hardware developer has actively looked at for the home market. I think this would also be of interest to the home theater market, as well as student markets, which may not have the physical space for a full tower. Speaking for myself, I know there's no way I could fit 2 double-slot graphics cards and a 3rd card into a MicroATX case.

I've also seen some rumblings going around about using GPUs to process sound commands, which leads me to pose this question: presume that GPU-enabled physics take off and are must-haves for the mass user. Let's say that both AGEIA and Havok get behind GPU-enabled physics. Let's say that in one year's time everybody is using GPU-enabled physics, or needs to, or has to have a dedicated PPU inside the case. AGEIA partially tackled the problem of getting hardware physics to consumers by releasing on the aging PCI slot. Granted, it's slow, really, really, really slow, but just about every computer built and shipped today has a standard PCI slot. Has ATi considered allowing manufacturers like PowerColor and Sapphire to release PCI versions of current cards that could be re-purposed for physics use over the PCI bus? Has ATi considered building versions of Radeon Xpress that would support an AGP slot for the express purpose of using the AGP card to accelerate physics?

Are there also considerations to release physics-ONLY cards that strip out the RAMDACs, display ports, and other parts of the card needed only for the visual display itself? In such a scenario the physics-only cards could sell for less than their visually enabled cousins. That would help clear GPU inventory, and GPUs that didn't make the cut for graphics could be binned into a physics-only card. If a physics-only card has been considered, then going back to the question about processing sound commands, has ATi considered releasing cards that carry two re-purposed GPU units? For example, an x1600 unit stripped of visual display for physics, and an x300 GPU for processing sound? Or have there been considerations to add sound support into Asymmetric Physics Processing and have the physics GPU also handle sound processing?

While I understand that most of these questions are probably going to be met with the typical "I can't comment on that at this time," part of me hopes that at least some of these items have already been circulated through ATi.
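The "really, really slow" jab at PCI in that letter is easy to quantify. Classic 32-bit/33 MHz PCI has a theoretical peak of about 133 MB/s shared across every device on the bus, while a PCI Express 1.x x16 slot offers roughly 4 GB/s in each direction, per slot:

```python
# Theoretical peak bandwidth: legacy PCI vs. a PCIe 1.x x16 slot.
pci_bw = int(33.33e6 * 4)            # 33.33 MHz x 4 bytes/transfer ~= 133 MB/s
pcie_x16_bw = 250_000_000 * 16       # ~250 MB/s per lane x 16 lanes, each way

print(f"PCI:      {pci_bw / 1e6:.0f} MB/s (shared bus)")
print(f"PCIe x16: {pcie_x16_bw / 1e9:.1f} GB/s (per direction)")
print(f"Ratio:    roughly {pcie_x16_bw // pci_bw}x")
```

A roughly 30x gap: a re-purposed GPU on legacy PCI would be badly bandwidth-starved, which is presumably why AGEIA's PCI card was a compatibility play rather than a performance one.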

Okay... so what would prevent AGEIA from using the GPU specifications to port their physics SDK? What would prevent PowerColor, Sapphire, XFX, Leadtek, or any other hardware vendor from re-purposing older chips onto stripped cards just for physics acceleration?

No. Intel doesn't own physics, and since I know that ATi has had this brought up before, I do not doubt that somebody in AMD/ATi is already thinking about this.

*Gives Illiad a GREAT BIG HUG*

But overall I'd rate Halo 3 as just okay. Nothing Special.

Major. Major. Major PROPS to Illiad for having the guts to post that. Then again, I've been saying that anybody who has played Metroid Prime 3 is wondering what the big fuss over Halo is about. It's sort of... I don't want to say rewarding... but refreshing to see other people finally stand up and declare that Halo is just an average game. It's competent, yes... but it ain't Half-Life... it ain't Unreal... it ain't Quake or Doom, it certainly ain't Metroid... and Bungie is never going to have the development team to stand up to Retro Studios, Epic, and id Software.

Anyways, while all the people who bought Halo 3 wonder what to do now... I'm going to go back and play Metroid Prime 3. It actually does something new.

Tuesday, October 02, 2007

Cedega : try to NOT piss off your community

The following is a modified version of a post run on the Cedega forums. The original poster argued that Cedega was losing its paying subscribers, and most of his arguments were based on Blizzard's World of Warcraft.


I find myself agreeing in some respects. While I personally find World of Warcraft to be watered-down and mass-marketed to Ubuntu and back, it is the 900 lb gorilla that opened a lot of doors for real Linux usage among desktop users.

Part of me feels a bit bitter about City of Heroes / Villains. The game is supposedly supported by the Cedega dev team, but it's been ages since the patch that broke Win2k compatibility, and there hasn't been any public word or documentation on a fix for the game.

Also tied to CoH/CoV are MSXML and .NET support. These technologies are going to be required to support the NCSoft unified loader... and NO, Mono on its own is probably NOT the freaking answer to this. Mono technology MIGHT be the answer if merged properly into the Cedega engine.

What irks me most is that these technologies shouldn't even be up for a vote, nor should support for the NCSoft unified loader. Yet not only are these technologies placed on a voting list, a substantial number of Cedega users are voting AGAINST them.

MSXML
-2: 4.93% (18)
-1: 1.37% (5)
0: 74.79% (273)

.NET
-2: 7.4% (27)
-1: 2.19% (8)
0: 58.36% (213)

NCSoft Loader
-2: 16.71% (61)
-1: 2.47% (9)
0: 55.89% (204)


So, from a voting point of view, a few people seem dead set against improving the Cedega technology.

Anyways, moving on: other aspects that irk me are the lack of developer reports and, let's face it, what kind of catnip has Wulfie been smoking? I hate to break this to developers and publishers, but bringing games to market using DirectX technologies is a million-dollar mistake developers CAN NOT keep making. Porting games to platforms that do not use DirectX (e.g. PlayStation 3, Nintendo Wii, Linux, Mac) means paying developers to either port the code or re-write the engines. If developers used OpenGL and OpenAL from the start, they could go across ALL platforms, even down to cell phones and mobile devices like the PlayStation Portable, without a significant reinvestment in porting.

(I've been over this in this blog before when I took ExtremeTech out to the woodshed)

Cedega is in a unique place to give empirical evidence to publishers and developers that Microsoft's DirectX is a BAD business move for games. I don't see, or hear, them doing that.

I'm also a little grumpy about Cedega's documentation on the Cedega 6.03 release. I'd REALLY love to know what brand of catnip the dev team was on to state that ATi's drivers were broken and Nvidia's were not. Sorry. Nope. Nvidia is the one with the broken drivers.


Now, I'm NOT saying that a simple posting on the main site, something to the effect of "We are busy working on the EA games for the Macintosh and we need to put Cedega development on hiatus until we finish," would have fixed all the ill-will that has been, and IS being, generated... but it would have gone a long way.

(As a small note, the EA games were made available within the past couple of weeks, so a developer crunch to get them out the door makes a bit of sense.)

But come on. Take a look at the website. Half the links on the TransGaming site are still broken. Cedega sections still link to TransGaming sections.

We don't see anything of the sort, and that, for one, is making me more hesitant to paint Cedega and TransGaming in a good light. If anything, it's making me wonder why I even bothered promoting Cedega in the first place.


Another, almost personal, problem I have is the strange feeling that the Cedega devs are content to let me handle the support requests and respond to off-the-wall votes. For a while my responses to technical problems in the Cedega support requests were faster than any official Cedega rep's. I am NOT being paid by Cedega, and it grates on my nerves. Okay, it's just a feeling and I can't prove it's intentional... but in Cedega's case, perception is playing a larger hand, with absolutely no significant information coming back from the Cedega developers.