
See also: last week's Back page page.

Linux links of the week


The Linux in schools project is working to place Linux machines in K-12 schools. The project's pages include a fair amount of introductory and howto information, and it hosts a mailing list. This is a worthwhile project; many U.S. schools are Macintosh-based these days, but it is not clear whether Apple will hold onto that market. If the Macs have to go, let's have a good alternative ready to replace them.

One of the very best mailing lists on the net is Phil Agre's Red Rock Eater News Service. Subscribers get 5-10 messages per week, most of which are highly interesting. See the archive site for a sample of the sorts of things that go out. One recent posting was this message about Halloween II.


November 12, 1998

   

 

Letters to the editor


Letters to the editor should be sent to editor@lwn.net. Preference will be given to letters which are short, to the point, and well written. If you want your email address "anti-spammed" in some way please be sure to let us know. We do not have a policy against anonymous letters, but we will be reluctant to include them.
 
   
To: editor@lwn.net
Cc: "Michael K. Johnson" <johnsonm@redhat.com>
From: "Michael K. Johnson" <johnsonm@redhat.com>
Subject: Your review of Red Hat Linux 5.2
Date: Sat, 07 Nov 1998 09:22:17 -0500


Thank you for your generally good review of Red Hat Linux 5.2.  I
would like to clarify a few points, however.

The first is that it is possible to keep Xconfigurator from probing,
and thus hanging, low-quality video hardware.  It is not necessary to
switch to the second virtual console and kill Xconfigurator.  The
installation manual covers expert mode, and says, "Expert mode
disables most hardware probing..."  In expert mode, Xconfigurator asks
whether you wish to probe.  Furthermore, Xconfigurator does not probe
hardware that we know causes the computer to hang, even when you are
not running in expert mode.  We have not experienced this problem with
the Virge cards we have in our testing lab.  Obviously, you have a
different Virge card from any of the many that we have, even different
from our Virge VX card.

The second is in regard to gnome-libs: "One wonders why they left
the older stuff in the main distribution."  Actually, there's no
mystery here.  We use the stable Gtk+ 1.0 libraries in the
distribution, and whereas GNOME 0.20 is built against the Gtk+ 1.0
libraries, GNOME 0.30 is built against the unstable, development Gtk+
1.1 libraries.  This precludes our using the GNOME 0.30 libraries in
the distribution proper.

Using the older gnome-libs is not a problem, because the only use of
gnome-libs within the distribution is for the gnome-linuxconf
interface to linuxconf.  It does not need any of the new features of
the GNOME 0.30 libraries.

Thanks again, and thank you for the opportunity to clarify these
points,

michaelkjohnson

"Magazines all too frequently lead to books and should be regarded by the
 prudent as the heavy petting of literature."            -- Fran Lebowitz
 Linux Application Development       http://www.redhat.com/~johnsonm/lad/

   
Date: Wed, 11 Nov 1998 13:25:49 -0800
From: Michael McAleese <mcaleese@home.com>
To: editor@lwn.net
Subject: Whither Linux?

The point has been raised that Linux kernel development has been
"following taillights"; that is, Linux has had a vision to follow
rather than having to innovate.  This may be true to a great extent,
but it does not mean things have to remain that way.

Perhaps a coordinated effort could be made to look to the future:
some sort of Linux kernel research project, something along the lines
of a web site where research topics are proposed and papers are
submitted for review and discussion, with promising areas being
targeted for actual code development against the current stable
kernel.

Once proof of concept has been demonstrated and Linus et al.
convinced of the utility of an idea, work could be passed on to the
Linux community to produce working code for integration into the
development kernel.  The research site would not be the place to
actually work on such projects; it would concentrate on visionary
ideas.

Sort of a Linux version of Xerox PARC...
   
From: "Gabrielson, Anthony" <AnthonyG@comversens.com>
To: "'editor@lwn.net'" <editor@lwn.net>
Subject: Microsoft
Date: Fri, 6 Nov 1998 14:26:28 -0500 

Dear LWN,

Many people in the Linux community are talking about how Microsoft
will be taking aim at Linux.  Personally, I do not see how this is a
big deal.  Linux users are different from Microsoft users.  Linux
users use Linux because it works.  When and if Linux does not work,
they can get to the heart of the problem in one of three ways: they
can hack around in the code themselves, they can pay someone else to
hack around in the code for them, or (with Linux beginning to be
adopted by commercial companies) they can now find a commercial
product that does what they need it to do.  Microsoft is not planning
to release the source to its entire OS product line, and I doubt it
ever will.  Microsoft is not planning on charging a reasonable price;
something is going to have to change drastically before it does.  As
long as Linux developers take not an anti-Microsoft stance, but a
"well, that's cool, Linux can do it better" stance, and as long as
they keep their pride in doing a good job, Linux won't be going
anywhere soon.  The minute it's not personal for them to do a good
job, Linux is in trouble.

That's just how I see it; maybe I'm wrong, maybe I'm right.  Time
will tell.

Thanks for your time,
Anthony
   
To: editor@lwn.net
Date: Tue, 10 Nov 1998 16:54:00 -0800
From: Jim Dennis <jimd@starshine.org>
Subject: To: Michael Dell   From: "vocal hundreds"


  It's amusing that, a few months after my open letter to Dell
  Computing, we hear that Mr. Dell will at least acknowledge Linux.


   I've copied this message to every e-mail address listed on the
  "Contact Dell" web page at:

	http://www.dell.com/feedback/index.htm

   ... simply because their site doesn't provide a single
  "feedback@" or "General Feedback and Requests" page.

  It's quite likely that my earlier open letter (copied to
  "webmaster@dell.com" for lack of a better address) never reached
  Mr. Dell, or anyone of any importance.  That would explain the
  utter lack of response that I have gotten from Dell Inc.

  I did have a couple of Dell shareholders drop me a line to tell me
  that they were Linux users, and that they also wished to convey
  their support of my message.  O.K., so they're just "little guys"
  and not members of your board of directors.  Perhaps you should
  have a channel for investor relations, so you can get their
  feedback, or you should let your shareholders in on the secret.

  So you get "hundreds" of requests for Linux, rather than
  "thousands."  You conclude from this that we are a "highly vocal"
  but "not necessarily very large" group of users.

  Has it occurred to you that perhaps only a couple of percent of the
  Linux-using populace bothers to speak up?  Perhaps many of your
  former customers are going to one of the fifty or so "little guys"
  that do offer Linux support.  (One list of these upstarts is at
  http://www.linux.org/vendors/systems.html.)  Perhaps most of your
  Linux-using customers simply sigh and buy.

  Could it be that you're only hearing from the vocal *minority*?

  You've already seen that some of your corporate customers will go
  through the additional hassles to get it *their way*.  Presumably,
  if you offered the option without the hassles, you'd find that many
  more would select alternatives.

  You may have heard that Mexico is planning to deploy Linux to
  140,000 school sites (with one server and about five Linux
  workstations at each).  That's nearly a million machines.  Too bad
  Dell wasn't ready with a low-cost, low-powered Linux solution for
  them.

  Of course, Linux might have a downside for Dell.  It doesn't
  require much hardware to run (a Pentium 120 with 32MB is plenty
  for a Linux workstation or server).  Also, your Linux customers
  have broken out of the "forced upgrade" cycle imposed by
  Microsoft.  My decade-old 386DX33 can run all of the same Linux
  software as my Pentium II.

  However, ultimately the market will decide.  The market for systems
  with standard and compatible parts put you in your current place.
  The market for systems that run standards-based, "open" operating
  systems could very well keep you there, or put some other company
  in your place.

--
Jim Dennis  (800) 938-4078		consulting@starshine.org
Proprietor, Starshine Technical Services:  http://www.starshine.org
   
Date: Mon, 9 Nov 1998 21:33:40 +0100
From: David Kastrup <dak@neuroinformatik.ruhr-uni-bochum.de>
To: editor@lwn.net
Subject: Future of Linux


Now that Linux is making visible inroads into commercial market areas,
there is a lot of excitement around.  We finally seem to be making
some steps toward the goal of "world domination, soon."

I will point out some milestones on the way there.

a) Linux is considered to have reached the critical mass needed to no
   longer be irrelevant to business decisions.  We are getting there
   at the moment with regard to media coverage and market attention.

b) Various parties will want to have Linux work for their own goals,
   and will support Linux developers to this end, mostly with lent or
   donated hardware and specs.  We have been seeing a lot more of this
   lately, as the costs involved are negligible and the impact on the
   growing Linux market (and probably outside the Linux market as
   well) is pretty large.  Examples include Adaptec, which is now
   helping out with driver development and specs; Sun, which has
   given out some UltraSPARCs to Linux developers; the I2O consortium,
   which has made its specs available in the hope that Linux
   development will provide it with reference implementations; and so
   on.

   A few parties, such as Digital (which has a history of supporting
   Alpha Linux from the start), have done their part from the early
   days of Linux to get their hardware recognized as a good Linux
   player.  Rather few vendors have made this sort of action official
   by signing up with Debian's Open Hardware Certification Program
   (http://www.openhardware.org); interest there has been pretty
   low-profile up to now.

   This support will mainly make life easier for Linux developers, as
   well as offer them more choice of what to develop for.  It will
   also cause more work to pile up than anticipated, and one does not
   want to disappoint the goodwill of the parties involved.  Little
   harm can come from it with regard to influencing Linux policy
   directions, since the developers making the decisions about what
   to do, and when, remain the same.

c) Some business players will find that the current flavors of Linux
   do not fit the bill for them, and will start their own development
   and distribution efforts.  Well, this has actually been more or
   less the driving force behind every distribution up to now.
   Still, some are more notable than others in that respect, such as
   Caldera, which pushes its own variants of commercial,
   NetWare-aware Linux versions.

   Interoperability between products from different vendors will
   become increasingly difficult.  For damage control, rigid
   standards will have to be agreed on.  In particular, the entire
   desktop environment will have to be standardized as well.  By this
   I do not mean that a decision on GNOME/KDE/whatever will
   necessarily have to be made, but that the user should be free to
   make that choice without sacrificing a pool of software.  That is,
   the appropriate protocols need to be defined and standardized in a
   way that makes all applications run and interact properly on all
   desktop environments.

   If this sort of standardization does not set in, the dynamics have
   the potential to harm Linux pretty seriously.  If Linux is
   considered a popular market factor, we will get a bunch of
   proprietarily enhanced Linux versions.  We will get Sun Linux
   (including options for proprietary compilers and proprietary
   high-performance NFS servers), SGI Linux (including
   high-performance proprietary OpenGL software and servers), yes,
   even Microsoft Linux (which is able to run Microsoft Office for
   Linux, given that you have installed the proprietary desktop
   servers).  Microsoft Linux will cost about half of what Win98
   costs now, will run much more stably, and will cost Microsoft
   about a twentieth of what Win2000 costs to develop, but it will
   remain a parallel product line at first because of a serious
   number of compatibility issues.  It will probably come with
   Microsoft's equivalent of Wine.  The resulting bloat from always
   having to run this emulator in between will still be less than
   what people have come to expect of NT, though.

   In order not to have to diversify too much, developers will use
   the development tools of the widely accepted Microsoft Linux
   distribution, which will result in applications running smoothly
   under Microsoft Linux.  Support for other vendors will eventually
   dwindle, except among some geeks who do not want to run Microsoft
   applications on a reasonably stable system.

World domination, finally.

You think this absurd?  Even now, among players who mostly share the
same goal (as seen with http://www.linuxbase.org), standardization
efforts are tiresome and arduous work, so much so that some Linux
vendors do not have the resources to participate (Slackware).  When
the parties involved do not like to talk with one another (like
GNOME/KDE), or are openly destructive (as would be expected with the
Microsoft Linux equivalent), things are not going to improve.

If corporations see Linux as a market factor, resources will be thrown
at it.  If we do not have a firm commitment to open standards in place
before this happens on a large scale, we will get closed and/or de
facto standards.  Even if we get open standards, they might be so
complicated that only the big players have enough flexibility to
implement them.  As an example, see how C++ has hobbled free compiler
development: the incredibly complicated language definition has caused
gcc development to freeze.  The FSF's non-commercial development
infrastructure for the gcc compiler, built around the comparatively
simple C language, could not keep up with the complications of the C++
language.  This has resulted in the split-off of egcs, mostly managed
by Cygnus, a commercial entity and major contributor.

And these are things that occur in situations where the involved
parties are doing their best to cooperate and further free software
development.

I am glad to see that the current major players in the Linux market
seem to have mostly good intentions.  They will not remain the only
players, though.  And I certainly hope that the rules of the game will
have been firmly established before the real brutes enter the playing
field.

While my scenario above has centered on one potential player for
illustration purposes, the entry of any large player with definite
interests into the Linux market could cause similar problems.  If
Linux is to keep its diversity and prosper with it, it will have to
have standards.  Not "standards" established by killing off the
competition, but standards arrived at by seriously working on
interoperability between different players.  These standards will
have to be minimal, in order not to stifle potential developers, but
sufficient to meet their purpose.

David Kastrup                                     Phone: +49-234-700-5570
Email: dak@neuroinformatik.ruhr-uni-bochum.de       Fax: +49-234-709-4209
Institut für Neuroinformatik, Universitätsstr. 150, 44780 Bochum, Germany
   
 

 

 
Copyright © 1998 Eklektix, Inc., all rights reserved
Linux ® is a registered trademark of Linus Torvalds