Linux links of the week

Linux Resources, a site run by the folks who do the Linux Journal, has recently been reworked and has a snazzier look. They also appear to be working to increase the amount of original content there.

Linuxports.com is dedicated to commercial ventures with Linux in general. More specifically, it is the home of the Linux Consultants, Commercial, and VAR HOWTOs. The site has recently been reworked with an easy submission mechanism for those wishing to be listed in the appropriate HOWTO.

Section Editor: Jon Corbet
May 27, 1999
Letters to the editor

Letters to the editor should be sent to letters@lwn.net. Preference will be given to letters which are short, to the point, and well written. If you want your email address "anti-spammed" in some way, please be sure to let us know. We do not have a policy against anonymous letters, but we will be reluctant to include them.
Date: Sun, 23 May 1999 22:49:08 +0200 (MET DST)
From: Tomasz Motylewski <motyl@stan.chemie.unibas.ch>
To: lwn@lwn.net
Subject: The Free Software Bazaar

On the page http://lwn.net/ you have written about SourceXchange and Cosource.com, but I feel that you should have mentioned an already working institution of this type: the Free Software Bazaar, http://visar.csustan.edu/bazaar/. I have been involved in one of its projects, and I must say it was great.

Best regards,
--
Tomasz Motylewski
From: schwarzma@healthpartners.com (Michael Schwarz)
Subject: PGP correction
To: editor@lwn.net
Date: Mon, 24 May 1999 14:15:22 -0500 (CDT)

I wrote a letter that was published in last week's LWN. A number of people wrote me and LWN to correct what I stated. While I was correct that PGP uses an RSA (of anywhere from 512 to 4096 bits) public/private keypair to encrypt a 128-bit IDEA session key, I was dead wrong that an attacker would concentrate on breaking the 128-bit key. Why? Two reasons.

1) A 1024-bit RSA key is much easier to crack than a 128-bit IDEA key. Why? Because the attack on the RSA key involves factoring the modulus into its pair of primes. The size of this problem is *smaller* than the problem of trying every 128-bit key. A nice summary of the issues can be found at: http://axion.physics.ubc.ca/pgp-attack.html

I've let this be a lesson to me. Don't think because you know a little that you know it all!

2) The second reason is that recovery of the public/private key pair gives you not just the one message, but every message encrypted with that key pair. Obviously, this is the holy grail.

For the record, a 128-bit key has 340,282,366,920,938,463,463,374,607,431,768,211,456 possible values. My comment about "decades" was modest. Further, cracking IDEA isn't helped by TWINKLE. TWINKLE speeds up factoring, which is not part of the problem in cracking IDEA.

My thanks to the several people who e-mailed me to correct my errors. I just wanted to set the public record straight myself and direct those people with questions (like me!) to the above URL, which summarizes the issues neatly.

--
Michael A. Schwarz           | "If God had meant for man to
msNOchwarz@sSPAMherbtel.net  | walk, he would not have invented
                             | roller-skates" - Roald Dahl
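[Ed: the letter's arithmetic can be checked with a short back-of-the-envelope sketch. The general number field sieve (GNFS) cost formula below is the standard asymptotic estimate with constant factors glossed over, so the RSA figure is only a rough order of magnitude, not an authoritative attack cost.]

```python
import math

# Brute-forcing a 128-bit IDEA key means trying up to 2**128 values --
# the figure quoted in the letter.
idea_keyspace = 2 ** 128
assert idea_keyspace == 340282366920938463463374607431768211456

# Rough exponent (in bits) of the work to factor an n-bit RSA modulus
# with the general number field sieve, from the asymptotic estimate
#   L(N) = exp((64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3)).
def gnfs_work_bits(modulus_bits):
    ln_n = modulus_bits * math.log(2)
    exponent = ((64 / 9) ** (1 / 3)) * (ln_n ** (1 / 3)) * (math.log(ln_n) ** (2 / 3))
    return exponent / math.log(2)  # convert the exponent from base e to base 2

# Factoring a 1024-bit modulus costs very roughly 2**87 operations --
# far less than the 2**128 needed to exhaust the IDEA keyspace,
# which is exactly the letter's point.
print(round(gnfs_work_bits(1024)))  # roughly 87
```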
From: Matthew Benjamin <MBenjamin@comshare.com>
To: "'editor@lwn.net'" <editor@lwn.net>
Subject: KDE Wars Again?
Date: Tue, 25 May 1999 11:17:53 -0400

Miguel de Icaza's remarks about KDE were unfortunate. GNOME has made great progress over the past year--moving, to be blunt, from a promising kit Nick Petreley couldn't get to work on his machine (around February, 1999) to an environment he prefers (May, 1999). From where I sit, though:

1. KDE has a very bright future. It is the default desktop for, IIRC, at least 5 packaged Linux distributions and the Corel Netwinder. The appearance that Miguel is not aware of this does not add to his credibility as an OSS guru.

2. From a software engineering perspective, QT/KDE are very well designed. I do see GTK's easy binding to many languages as a major strength--the best purely technical reason to use it, in fact. But it is not one that takes away from QT/KDE--far from it. Miguel's apparent belief that implementing in C connotes software quality--or even makes up for poor or uneven implementation quality--does not add to his credibility as a software designer. GTK/GNOME's greater flexibility is one of _its_ strengths, but is not thereby a _weakness_ of QT/KDE.

3. From a user-interface design perspective, KDE is very well done. It merges ideas from many desktop environments into a seamless whole that is very ergonomic and effective. At least one reviewer has said he prefers it to the Macintosh. The QT/KDE toolkit makes stable and visually consistent applications very easy for novices to create. Since KDE allows--but does not require--a very Windows-like UI style, the attractiveness of Linux/UNIX to current Windows users is greatly enhanced--no small advantage to the entire OSS enterprise, in my view.

4. QT/KDE has been declared free by the maintainers of the Open Source Definition, and OSI. The KDE framework itself is fully GPL'd and LGPL'd.
No one is helped when the leader of one Open Source project lets himself be quoted saying his Open Source competitors "aren't really free." (I believe that this violates a basic rule of Open Source etiquette, though I am not an OSS anthropologist and cannot make proclamations like this.)

5. The QT/KDE team has shown great leadership. I think any fair reading of history gives them credit, at the LEAST, for showing that a new, from-scratch, world-class UNIX user interface could be done at this late date, and, most importantly, that it could be done as Open Source software. No one can take that away from them--and I think it is unseemly to try.

6. I don't think that KDE developers engage in this kind of trash talk about GNOME developers; quite the opposite, in fact.

What motivates this behavior? I'm sorry, I don't understand. GNOME spokespeople should focus on developing and documenting the strengths of their own approach (which are many), and should be generous to their KDE competitors.

Matt Benjamin
Date: Thu, 20 May 1999 11:36:52 -0400
From: Walt Smith <waltech@bcpl.net>
To: letters@lwn.net
Subject: the Mindcraft challenge

Hi all,

I'm an occasional Linux user, not a developer. I have gradually become educated in Linux and have installed several systems and configured several server/application tasks. I've also done the same with Windows. I agree that Linux may not be quite truly ready for the desktop (at this time) and makes a dynamite server. That being said for perspective...

I like Linux as an alternative to MS for many tasks and use both (win95). Today, I read the MS/Mindcraft challenge linked by lwn.net. It reads like a Clinton/Milosevic pamphlet. (Sorry--with the Kosovo thing, and having read the Clinton transcripts, it seems appropriate.) No matter the validity of a retest, the "results" posted by Mindcraft will be way out of proportion. Frankly, the way the challenge is written (along with the comparison list of the previous test), it appears the audience is a 3rd world country--or those souls who are extremely limited in use of the OSes. Possibly housewives or gardeners who have zero interest in such matters? (Corporate managers?) It looks to me to be written by a plain huckster. There is a line between good solid American salesmanship (with normal exuberance) and hucksterism.

While there was much I take exception to, I cannot factually object on many technical items because of my lack of direct experience. However, the statement that "Linux" is slow to respond to the challenge is something that I can't let go. "Linux" did respond by instantly rejecting the results of the test, asking for another test, and stating the conditions, which sound quite reasonable to me. Simply because a date wasn't instantly agreed to (did Mindcraft propose a date?) doesn't mean that "Linux" (implied: the Linux community) is slow. It means the challenge was issued to no one in particular at no particular time.
LWN is correct--it's a trap; but an obvious one, with pure, biased, self-serving marketing propaganda and the attendant publicity as the objective. Marketing does work, but in a free society such as ours, really bad-tasting soup that sells during the first few weeks it's advertised eventually has no more buyers. Untruthful, unadulterated propaganda has a habit of backfiring.

regards,
Walt Smith, Baltimore
Date: Tue, 25 May 1999 18:08:52 +0100
From: Aaron.Trevena@msasglobal.com
Subject: more flaws in NT v Linux pieces
To: thurrott@wugnet.com

Paul,

Both PC Week and PC Magazine are more used to NT, as they come from a PC/home environment and don't really have the experience in servers that, say, Byte or Performance Computing have. The reporting style alone is as poor as that of the glossy computer-ad magazines; it is hardly in the same league as professional journals.

This is shown even more clearly by a total lack of understanding in implementing the dynamic content benchmarks. Threaded server extensions like ISAPI or NSAPI are totally different from CGI, and comparing them directly is meaningless. Linux and Unix have a variety of servers, but Zeus and thttpd, renowned as the fastest web servers, were not included in the test, while the Apache developers have always made it clear that the aim is sufficient speed to do the job well while providing reliability and extensibility that IIS and other commercial servers cannot offer. Zeus provides ISAPI support as well as a huge speed increase over Apache, yet this isn't even mentioned.

Not only were applications and servers missed out, but even the most obvious Unixes. SGI's IRIX is known to outperform NT using Samba, but wasn't included. Net/Open/FreeBSD, the 'other' free Unixes (with original UNIX heritage), are not mentioned, and neither is BSDI, the high-end commercial BSD Unix designed exactly for networking and web serving.

The e-commerce tests were a joke, comparing completely different techniques and systems. PHP, Zope, Chillisoft, EJB, Oracle, DB2--none of these were included in the tests, but these are what professional application developers use. mod_perl--the Apache Perl module that provides high-speed Perl CGI--was not included, nor was Velocis, its commercial cousin.
The tests were poorly researched and ran for only 4 hours. Web uptime for UNIX is measured in hundreds of days, so 4 hours is of very little value--what happens when ARCserve on NT crashes and you are given the choice of rebooting NT or risking no backups? I have seen it happen where I work. It would have been useful to see how well the machines were doing after 45 days, or 100 days, with that consistent load.

The problem with journalists familiar with Windows is that they don't know enough about UNIX or open source to do the right research (if any at all), and Linux and OSS advocates have to point out the obvious to them. But then the readership of these magazines, as well as the advertisers--all of whom have a lot riding on NT--want to hear how good it is and how they made the right choice.

The gaping holes and skewed facts are not reported objectively by professionals; they are crowed about when the magazine's preferred vendor does well and whispered about when it doesn't. (See how it isn't mentioned, outside of the numbers themselves, how Solaris outperforms NT, or how Samba beats NT when serving NT clients, in comparison to headlines screaming that NT is faster than Linux--when in fact IIS on NT serves some types of web pages faster than Apache on Linux only if you have expensive enough hardware and run particular types of test: ISAPI v CGI.) It is hardly surprising when we kick up a storm about it.

Aaron Trevena.
Intra/Internet Developer & System Administrator (AIX, NT, LINUX)

nb: your reply would be much appreciated; this has been cc:ed to Linux Weekly News.
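[Ed: the CGI-versus-server-extension distinction the letter draws comes down to process startup: plain CGI launches a new process for every request, while ISAPI/NSAPI/mod_perl-style handlers stay resident in the server. A minimal sketch, with interpreter startup standing in for a real web request--the function names are illustrative, not any particular server's API:]

```python
import subprocess
import sys
import time

# CGI-style: every request forks and execs a fresh interpreter process.
def cgi_style_request():
    result = subprocess.run(
        [sys.executable, "-c", "print('hello')"],
        capture_output=True, text=True,
    )
    return result.stdout.strip()

# ISAPI/NSAPI/mod_perl-style: the handler is already loaded in the
# server process, so a request is just an in-process call.
def in_process_request():
    return "hello"

def total_time(fn, n=20):
    start = time.perf_counter()
    for _ in range(n):
        fn()
    return time.perf_counter() - start

# Per-request process startup dominates by orders of magnitude, which is
# why benchmarking ISAPI against plain CGI mostly measures fork/exec
# cost rather than the web servers themselves.
print(total_time(cgi_style_request) > 100 * total_time(in_process_request))  # True
```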
Date: Fri, 21 May 1999 12:59:20 +0100
From: Charlie Stross <charlie@antipope.org>
To: letters@lwn.net
Subject: On copyright, free software, and being Restrictively Unrestrictive

There's something of a row going on at present over the ideological or political trappings of the FSF, and specifically the GPL. Various people have been throwing accusations around ("Richard Stallman is a communist", for example). Others are saying that the GPL is restrictive and is an attack on non-open-source software. I think these people are completely missing the central point. The free software movement is like the little boy standing by the parade, pointing at the Emperor, and shouting "but he isn't wearing anything!" The emperor in question is, of course, our current notion of intellectual property.

Let's go and take a peek through the wonderful cinemascope time-viewer, and replay some interesting bits of history. Back before the Gutenberg revolution, if you'd suggested the concept of copyright to anyone who was literate, they'd probably have stared at you as if you were mad. Copying information was a highly labour-intensive operation: a mass market for duplicated texts simply didn't -- and couldn't -- exist. Patents -- or their forerunners -- existed, in the form of royal grants to some individual or guild to have exclusive ownership of some tool or mechanism for production, and the guilds had their secrets, but the legal basis for ownership of trade secrets was different from the basis we understand today: you owned one because the King said he'd hang anybody else who muscled in on your turf (as long as you behaved yourself and paid your taxes). The contemporary explanation of patent rights would have been incomprehensible, because the concept of a society based on a social contract and mutual observation of rights didn't exist: there was no mechanism whereby society (or its legislators) could agree to grant rights to inventors in order to encourage their creativity.
Let's hit the fast-forward button a bit, and take the leap into the age of enlightenment -- post-printing-press, post-monarchical. Duplicating texts had become a problem by the nineteenth century. Earlier solutions included licensing printing presses, but in a society that encourages free speech there's no obvious justification for that. A situation arose where any aspiring novelist who published a book would be vulnerable to unscrupulous printers copying their work and re-selling it, pocketing the profits that accrued. Mass literacy brought its own new social problems.

The solution to this problem was the idea of copyright: that the author of a work had the power to grant a right of copying over it. A sensible and moderate solution within the context of the time, because printing presses were big and pirate printers could be tracked down and sued in civil court. A similar approach was taken to inventions; it was merely common sense that an inventor who came up with a genuinely new innovation should have the right to reap some profit from it before carpetbagging imitators duplicated the idea and swamped the market. Patents originally were a sign of progress; by protecting inventions they made it feasible to publish details of them, rather than trying to maintain the secrecy surrounding them. This in turn encouraged a climate of invention. Secrecy, as we should all know, is one of the enemies of progress.

And now let's hit that fast-forward button again and jump all the way to the present day. The concept of copyright has been over-extended. From protecting an individual author's rights to their work, it has been extended to protect vast corporations. From covering published books and pamphlets that some individual slaved over, it now covers what a Marxist economist would call alienated labour -- the capital accumulation of information.
By extending copyright seventy years after the author's death, our legislators haven't done anything for their surviving families, but have taken a large chunk of our common cultural heritage and handed it over to faceless corporations who can dole it out on a commercial basis. By extending copyright cover to music, the legislators have granted new rights: the music industry in turn is concerned with constructively extending their copyright in such a way that the consumers pay per performance, rather than paying a one-off purchase fee related to the recording medium. And so on.

The patent laws have also been shown to be defective. Software patents run for the same 20-year period as normal patents: but in the febrile world of software, 20 years covers as many generations as 75 years in the automobile industry or 250 years in the construction industry. Meanwhile, patent agency staff who are manifestly untrained for the task grant patents on inappropriate inventions and things which simply are _not_ inventions, such as the algorithms underlying public-key encryption. By granting patents on mathematical principles, they are hampering the growth of the industry rather than fostering it; it's as if they had allowed some company to patent the refractive index of glass and claim royalties from any other company producing materials that shared that physical characteristic.

And so, we come to the free software movement: loudly declaring "but your whole idea of copyrights and patents and selling something that can be copied freely is a load of crap! Charge for support and services, make the software itself free, and you won't have to deal with these internal contradictions!" Well, time will tell.
Personally, I think the answer is a thorough overhaul of copyright and patent laws, drafted not from the point of view of the big multinationals (who want to be able to copyright database schemas and patent mathematical theorems if it helps them make more profits) but from the point of view of the original agreed social goals -- to protect the writers (and programmers, and musicians) from plagiarism, and to encourage the inventors to keep inventing and raising our standard of living.

-- Charlie Stross
(Linux columnist, Computer Shopper (UK))