On the surface, all of these programs share the same basic function: implementing a software lock on a program.
This software lock prevents the users of that program from carrying out certain actions, and at the extreme end can prevent users from running the program at all. Despite these similarities, PunkBuster is welcomed in gaming communities, while the other utilities are openly reviled.
The impetus for this post was the repetition of an oft-used claim I saw on the Sega Forums. Somebody commented that the application in question would never come to /Linux. The statement was based on that application's use of NProtect's GameGuard, and on the following quoted justification: "which goes against the general use policies of linux."
The reasoning behind the statement took me aback. For starters, there are no such general use policies for /Linux systems. Secondly, I am very familiar with the GameGuard program. It is a competitor to PunkBuster and, as far as I am aware, not a malicious rootkit or a Digital-Rights-reMoval application. In my mind there is a clear difference between useful utilities that prevent players from hacking games, malicious DRM rootkits, and benign DRM services.
The Lock-Out Implementations
- Anti-Hacking: PunkBuster, GameGuard, Valve.Anti.Cheat
Tools like this prevent the computer user from breaking an application and using that break to affect other players in a networked environment. These tools generally do not prevent modifications to the program itself; the digital lock runs as a process, and its use is generally optional. The person hosting the network server must enable the anti-hacking tool, and the person launching the client application must agree to use it.
- DRM Malicious Rootkits: SecuROM, Tages, ZDPP
Rootkits like these prevent the user from accessing the application itself, and the digital lock is generally implemented at a system-wide level rather than a process level. The result is that these rootkits take control of the application away from the user, and they can cause permanent system-level damage. Most of these rootkits have limited activations or installations which cannot be renewed or extended, forcing purchasers to buy the software again if they want to continue using what they already paid for.
- DRM Benign Single-Sign-On Services: Desura, Valve.Steam
Services like these require an internet connection in order to authenticate ownership of the application. Applications are stored in a defined container, but users are largely not restricted from modifying the stored applications. The digital lock is often implemented as a process. These services often include additional features such as unattended installations, automatic updating, data-file synchronizing, cloud storage, and extras such as storefronts or library management.
The drawback to Single-Sign-On systems is that they do not address the offline user. Not to put too fine a point on it, but what was wrong with entering a unique CD key?
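For readers who never dealt with one, here is a toy sketch of the sort of check a unique CD key made possible. The key format and the mod-97 checksum are invented for illustration and do not reflect any vendor's actual scheme:

```c
/* A toy offline CD-key check. The "AB12C-34DE5" shape and the
 * mod-97 checksum are invented for this sketch; real schemes
 * were proprietary, but worked on the same principle. */
#include <ctype.h>
#include <stdio.h>
#include <string.h>

static int key_is_valid(const char *key)
{
    /* Expect eleven characters with a dash in the middle. */
    if (strlen(key) != 11 || key[5] != '-')
        return 0;

    unsigned sum = 0;
    for (int i = 0; i < 11; i++) {
        if (i == 5)
            continue;                      /* skip the separator */
        if (!isalnum((unsigned char)key[i]))
            return 0;
        sum += toupper((unsigned char)key[i]);
    }
    /* The whole test runs locally; no server is consulted. */
    return sum % 97 == 0;
}

int main(void)
{
    char key[32];
    printf("Enter CD key: ");
    if (scanf("%31s", key) == 1)
        puts(key_is_valid(key) ? "Key accepted." : "Key rejected.");
    return 0;
}
```

The entire check runs locally, which is exactly what made it work for offline users; it is also why such checks never seriously slowed down pirates.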
To me these differences are as clear as day and night. But what is the perspective of somebody who is not as technically inclined as I am? Are these programs really all that different?
What about from a moral or ethical standpoint? Is it ethical to lock down computer software to prevent access or modification? Is that a morally right thing to do? For me the determination comes down to a very specific litmus test:
- Is the lockout going to beneficially affect somebody else's application experience?
- Is the lockout going to negatively affect your personal application experience?
- Is the lockout intended to prevent theft of the application?
Yes, it is morally and ethically correct to lock software if that lock prevents a negative experience for somebody else. This covers the anti-hack tools such as PunkBuster and GameGuard, which ensure that people playing games in a networked environment are playing in a fair environment. Such lock-outs are already supported within the /Linux software ecosystem; technologies such as PunkBuster have native IA32 and x86-64 libraries. Strangely, the aforementioned GameGuard does not advertise GNU/Linux support, even though I am led to believe that GameGuard has at least a native x86-64 client available upon request in order to compete with PunkBuster and Valve.Anti.Cheat.
No, it is not morally or ethically correct for lockout software to prevent you from using the software you purchased. This covers the malicious rootkits that can destroy your operating system or force you to re-purchase a software license. It also covers always-on, 24/7 dial-home requirements for games that are not internet-only.
It is morally and ethically permissible to implement a software lock to prevent theft. However, in order for this type of lock to be acceptable, the lockout needs to be non-destructive and flexible. Single-Sign-On services are an acceptable compromise that give content creators a level of theft protection while not threatening the user's computing environment.
The /Linux Perception
With the above concepts in mind, that there are notable differences in software locks and notable differences in what makes those lockouts acceptable or unacceptable, how does this relate to /Linux? How do we explain to somebody who is unfamiliar with the /Linux software ecosystem that software lockouts are permissible? How do we explain that there are no such things as general use policies?
The answers to these questions can be complicated. Over the years an extensive amount of Fear, Uncertainty, and Doubt has been generated on the subjects of the /Linux kernel, the GNU Operating System, third-party applications, licenses, and many other aspects of the overall /Linux software ecosystem. A very recent case in point is the Free Software Foundation's call-out of Canonical over the use of private keys and Grub2.
Many computer users seem to be under the impression that proprietary programs cannot be run on /Linux systems, or that technologies that implement a software lock cannot be run on /Linux due to some non-existent policy. Most believe this either due to the repetition of F.U.D. from sources such as Microsoft, or due to general confusion from a lack of education. Before going further it would probably be a good idea to clarify the relationships between the kernel, the operating system, and applications.
The component breakdowns for Android/Linux and for an embedded GNU/Linux both show the same drill-down: applications talk to the APIs and libraries in the operating system, and it is those APIs and libraries in the operating system itself that turn around and talk to the hardware devices exposed by the kernel. Incidentally, this layout is why you have *updates* for drivers, libraries, and APIs: errors or inefficiencies in these components can affect the entire operating system because of how low they sit in the system.
This breakdown also explains why applications compiled for Android+Chromium are not necessarily compatible with applications compiled for GNU, and vice-versa. While the underlying kernel itself may be the same, applications generally talk to the APIs and libraries in the operating system rather than to the kernel itself. This is the Important Bit to remember.
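To make that Important Bit concrete, here is a minimal sketch, assuming an x86-64 GNU/Linux system with glibc. Both calls end up at the same kernel boundary, but only the first goes through the operating system's library layer:

```c
#define _GNU_SOURCE        /* ensure syscall() is declared */
#include <stdio.h>         /* puts(): an operating-system library API */
#include <unistd.h>        /* syscall(): the raw kernel boundary      */
#include <sys/syscall.h>

int main(void)
{
    /* The normal application path: call glibc, which buffers the
     * output and eventually issues write(2) to the kernel itself. */
    puts("hello via glibc");

    /* The boundary glibc sits on top of: a direct write(2) system
     * call, bypassing the library layer entirely. */
    syscall(SYS_write, 1, "hello via syscall\n", 18);

    /* An Android binary could make the same raw syscall, but its
     * library calls resolve against Bionic rather than glibc, which
     * is why the binaries are not interchangeable even though the
     * kernel underneath is the same. */
    return 0;
}
```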
Applications that run on GNU/Linux operating systems generally access the GNU libraries, which are released under the GNU Lesser General Public License (LGPL). Let me quote something from 2004 written by the FSF: http://www.gnu.org/licenses/lgpl-java.html
FSF's position has remained constant throughout: the LGPL works as intended with all known programming languages, including Java. Applications which link to LGPL libraries need not be released under the LGPL. Applications need only follow the requirements in section 6 of the LGPL: allow new versions of the library to be linked with the application; and allow reverse engineering to debug this.
Note the two salient (originally bolded) points made by the Free Software Foundation. Any and all applications can access the GNU libraries, regardless of license, financial cost, or any other factor. The only restrictions are that the author of the application must allow new versions of the library to be linked in, and cannot forbid reverse engineering of the product for the purpose of debugging those library updates. Again, this was written in 2004, and some elements of the LGPL have since been updated or clarified in LGPL Version 3.
There is another restriction to the LGPL, and it is one Google brings up here: http://source.android.com/source/licenses.html,
LGPL (in simplified terms) requires either: shipping of source to the application; a written offer for source; or linking the LGPL-ed library dynamically and allowing users to manually upgrade or replace the library.
Many users and developers get hung up on that OR, and for some reason or another believe that using the LGPL'd GNU libraries requires releasing the source code that calls upon those libraries. There are some valid concerns here for some vendors, since dynamic linking can be an issue on embedded platforms such as cellphones and tablets. In such constrained software environments the operating system is distributed as a static image. Historically, most constrained-computing devices (which, for the sake of argument, means almost every electronic device with an embedded operating system) are never updated after they are released. This is one of the reasons carriers such as AT&T struggle to get Android operating system updates out on something that does not resemble a geological time scale. AT&T has not yet adjusted to users not only wanting, but demanding and expecting, operating system updates on an embedded device as part of the service plan.
In terms of desktop usage, this is not really a problem. Although many users and developers might be unfamiliar with the /Linux software ecosystem, they should be familiar with the Microsoft Windows distribution method. Microsoft generally presses out a static disc image for their Windows Operating System, and it is this image that is distributed to end users and vendors. The end-users and vendors are responsible for ensuring that the static-image that was distributed is then updated with the latest sets of software patches. This is a very normal operating procedure for users of desktop computers.
Most GNU/Linux distributions work in much the same way. The user installs the operating system, then pulls down updates for the operating system. Developers writing for GNU/Linux systems thus have to decide whether to statically link their library files for distribution, or to dynamically link and simply use the libraries provided by the operating system. Dynamically linked applications are generally preferred: users can have a wide range of GNU library versions in use, and dynamic linking is the only sane way to distribute an application across them.
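As a concrete sketch, consider a closed-source program that uses the system's math library. The file name tiny.c and the build commands in the comments are illustrative, not any vendor's actual recipe:

```c
/* tiny.c - a closed-source application using a shared system library.
 *
 * Dynamic linking, the LGPL-friendly default:
 *     gcc -o tiny tiny.c -lm
 *     ldd ./tiny      # lists the shared libraries the binary will
 *                     # resolve at run time, which the user is free
 *                     # to upgrade or replace (the LGPL section 6 option)
 *
 * Static linking bakes the library into the binary, which is when
 * the LGPL's source or relinking obligations come into play:
 *     gcc -static -o tiny tiny.c -lm
 */
#include <math.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    (void)argv;
    /* When dynamically linked, sqrt() resolves against whichever
     * libm the user's operating system provides. */
    double x = (double)argc + 1.0;   /* runtime value, not a constant */
    printf("sqrt(%f) = %f\n", x, sqrt(x));
    return 0;
}
```

Nothing about the dynamic build obliges the vendor to release tiny.c itself; pointing the binary at the system's shared library satisfies the quoted LGPL option on its own.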
To reiterate: the dynamic-linking option in place of releasing source code does not, in any way shape or form, concept or idea, particle or boson, prevent a proprietary application with an attached financial cost from using the GNU libraries.
What about the Kernel?
So then, if there is nothing to prevent proprietary programs from running natively on /Linux systems, what about the Linux Foundation's call-out of Nvidia for its proprietary drivers in 2008?
Again, these are two separate things. The /Linux kernel is just that, a KERNEL. The /Linux kernel is not an operating system. For the /Linux kernel, proprietary drivers are a nightmare, which is why there are only two real proprietary drivers of note: Nvidia-GLX and AMD Fglrx. Linux kernel development simply moves too fast to support a stable out-of-tree driver API. This rapid pace is one of the reasons AMD has said they'll be opening up Catalyst for the HSA Foundation (slide 30).
It is important to separate the kernel from the operating system. Yes, the /Linux kernel developers have a very vocal policy against software lockouts and proprietary licenses, but that policy only applies to the /Linux kernel, not to the operating system. Case in point: the Android+Chromium operating systems also use the /Linux kernel, but because they are not widely associated with GNU/Linux and the "viral" GNU Public Licenses, they do not suffer from the perception that there is a policy against software lockouts or proprietary-licensed software.
Distributing Digital Rights reMoval software
In theory then, could malicious rootkits under proprietary license like SecuROM or Tages be brought to the GNU/Linux platform? Assuming that the native-client applications were dynamically linked to the GNU Libraries, then yes.
Could those applications be distributed? If the license allows for the unencumbered redistribution of the application, then yes.
Would those programs be distributed? This is the better question to ask. One of the signature problems of commercial /Linux support is actually getting applications into the hands of downstream users. Many Windows users may be unfamiliar with general GNU/Linux distribution methods, but they are probably familiar by now with digital distribution through applications like Valve.Steam, Google Play, the iTunes App Store, or the Amazon App Store. These digital distribution stores are largely modeled after the networked software storage systems developed for GNU/Linux known as package repositories. Valve.Steam, for example, is often referred to as "Apt for Windows" given its multiple similarities to the Debian Apt system.
Many of the applications released into the /Linux software ecosystem are released under open-source licenses with unencumbered distribution terms. This allows the programs to leverage the networked package repositories for storage and distribution. Programs with proprietary licenses can still be distributed through package repositories; case in point, Debian designates proprietary-licensed applications as non-free and makes them available, although as a separate option from the main distribution.
There is a difference between an application that can be added to a repository and one that will be added. Most package repositories tend to be guarded with multiple levels of security. For example, becoming a Debian Maintainer requires jumping through lots of hoops, including physically meeting with another maintainer. Adding a deliberately malicious package would destroy the trust downstream users place in the maintainers of the repository. Offhand, I think this might be where the idea of a universal policy against software lockouts came from.
There is a drastic difference between a Repository Maintainer protecting downstream users from a malicious application, and a policy against software lockouts. One does not beget the other.
Where do we go from here?
The /Linux software ecosystem continues to grow across both GNU/Linux and Android+Chromium/Linux. Commercial vendors who have long ignored the /Linux software ecosystem are slowly being forced into adopting platform-neutral development techniques. In all fairness, the platform-neutral approach has also been helped along by deliberate breaks in Microsoft's Windows operating systems. For many developers the only way to target Windows XP, Windows 7, and Windows 8 for application deployment is to adopt a platform-neutral development strategy, such as adopting graphics technologies like OpenGL over DirectX.
From my perspective the turn-about has been both hilarious and painful to watch. Companies like Valve and Unigine tend to approach /Linux, and for that matter platform-neutral development and distribution, as a market reality rather than a one-off experiment. Companies like Electronic Arts tend to approach /Linux development and distribution as an experiment, something that can be abandoned if things do not go completely right. Companies like Activision will happily use /Linux for servers, but have no idea what to do with the desktop /Linux market other than ban Diablo III players who were not using Windows.
With non-native repository solutions such as Valve.Steam and Desura, vendors now have an external avenue for distributing their native-client protected applications to downstream users within the /Linux software ecosystem. Does this mean that we will see the rise of malicious software distribution through Valve.Steam or Desura?
My guess is an "unlikely no." There are multiple reasons for this, starting with the simple fact that most malicious software lockouts never really worked to begin with. Anti-consumers such as pirates were not halted by malicious rootkits such as SecuROM, Tages, or ZDPP. The malicious rootkits only impacted legitimate users.
Then there is the network-connection question. Some of the vendors I've talked with over the years admitted that they shipped a malicious rootkit instead of a Single-Sign-On service for the sole reason that they wanted to prevent application theft by offline users. The computing market has changed greatly in the past several years as internet access has become almost ubiquitous. While it is possible that there are still Windows users buying modern application packages with no intention of ever connecting to the Internet, it would be a bit of a stretch to find a /Linux user with disposable income looking to buy a modern application package with no internet access. Application vendors can probably be assured that solely distributing their applications on GNU/Linux through a Single-Sign-On service such as Valve.Steam or Desura would not limit or hamper potential sales.
The resistance to pushing commercially released consumer applications into the /Linux software ecosystem is not going to go away overnight. Vendors and consumers need to be educated on what the open-source licenses really say, and years of Reaper Indoctrination say Shepard is alive, I mean, years of Microsoft's F.U.D. flinging are going to be difficult to counter. We will continue to see vendors decline to release their applications into the GNU/Linux ecosystem due to concerns over licensing, library linking, or imagined policies, regardless of what the facts actually are.