Mail Archives: opendos/1997/02/05/20:51:00

From: mharris AT blackwidow DOT saultc DOT on DOT ca
Date: Wed, 5 Feb 1997 18:30:32 -0500 (EST)
Reply-To: mharris AT blackwidow DOT saultc DOT on DOT ca
To: OpenDOS discussion list <opendos AT mail DOT tacoma DOT net>,
opendos-developer AT mail DOT tacoma DOT net
Subject: [opendos] OS advancements and old technology: My viewpoint.
Message-ID: <Pine.LNX.3.95.970205144009.4560H-100000@capslock.com>
Organization: Total disorganization.
MIME-Version: 1.0
Sender: owner-opendos AT mail DOT tacoma DOT net

There has been a lot of discussion lately on the list about
improvements in OpenDOS, and software in general breaking on
older machines.  Although this is sometimes the case, it is not
always, and in our recent discussions concerning COMMAND.COM,
etc, it is NOT the case.  

I just read a few letters about NEW programs not working on "such
and such" a computer, where "such and such" is either a
NON-IBM-PC-COMPATIBLE, or a legacy system. Things like that
really upset me to hear people get mad and want to RUIN a
technological improvement to maintain compatibility with ANCIENT
and/or incompatible hardware.

The way that I see it is this:

If you have an ANCIENT machine, and it WORKS, then WHY do you
need to upgrade it at all?  Surely you can't expect your
machine to be endlessly supported by ALL software!  Why should
all of TODAY's NEW computers be hampered and not used 
to their full potential, for the sole purpose of maintaining
permanent backwards compatibility with the ENIAC?

The post I just read about Turbo C 1.5 not running on the DEC
Rainbow (a NON-PC compatible as stated) is irrelevant.  If TC 1.0
works, then use it!  Why should the NEW compiler (or any
software for that matter) be FORCED to give up new technology
and new ideas in order to work on a legacy computer that is not
even IBM compatible to start with?  First there was the IBM-PC
and MSDOS, then there was a whole PILE of clones and compatibles.
Since MSDOS was not designed to be a portable and hardware
independent OS (a la UNIX), and since it did not shield
programmers from the hardware with an OS layer, it was
basically MADE for the *IBM-PC*.  I'm talking about the MSDOS
that shipped on IBM-PCs too, not any predecessors.  Machines
that were NOT 100% IBM compatible were just *that*; NOT
COMPATIBLE.  Those who bought INCOMPATIBLES were faced with
INCOMPATIBILITY problems.  This may still be the case in some
instances.  Solution: Buy a modern COMPATIBLE computer.  Case
closed.

From the software development side of things, a developer wants
to write software that works on the GREATEST number of machines.
To do that, it makes sense to stick to some sort of standard.
That standard ended up (in this case) being the IBM-PC.
Including code that would work on the plethora of
non-compatibles would cause a lot of code bloat and development
nightmares.  You would basically be adding a "hardware
independence layer" in all of the *applications*, instead of
putting one in the *OS* where it belongs.  Because DOS is not
such a protected OS, programmers use hardware tricks to boost
performance in their programs.  Since MOST computers are 100%
compatible (in most areas that actually matter), MOST direct
hardware programming works on ALL machines.  Whenever possible,
however, most programs and their programmers stick to defined
standards.  Since a computer made with differing hardware
usually ended up causing its users problems, people stopped
buying incompatible computers.  Thus, many PC makers learned the
hard way not to deviate from the accepted standard, the IBM-PC.
(Ever use a Tandy?  Prime example.  I've got one!)

By sticking to standards, we help maintain an even playing field
when developing.  When new technology or hardware comes into the
picture, new standards are needed.  These new standards sometimes
break old standards, or old technology.  If significant gains can
be made by breaking an old standard and making a new, then it
should be done.  (gains != $$$, but rather gains == flexibility,
power, robustness, improvements, etc)

In most cases, developers TRY to maintain as much backwards
compatibility with hardware and software as possible; however,
there is a point in some cases where this can't be done.  MSDOS
is an x86 OS, but TC 1.5 is an "IBM-PC or 100% compatible PC
running the DOS operating system" program.  You can hardly expect
a program made for one architecture to run on another simply
because the OSes are the same.  For example, a DEC Alpha
running Linux won't run i386 Linux apps: the same OS, but a
different architecture.  Sure, the recompiled sources might run,
but the binaries won't (although I'm sure that an emulator is
under way, that is beside the point).  Likewise, you can't
EXPECT a program written for an OS such as DOS, which runs
on the IBM-PC with an x86 processor, to run on ANY machine with an
x86 processor.  The reason is that the HARDWARE is TOO DIFFERENT.
If all programs were made without using any direct hardware
access, then all programs would be written in ANSI C and run over
VT100 terminals and be SLOW, NON-GRAPHICAL, and BORING.
(Besides, that is what VMS is for. :o)


If I write a program and say that it is for the IBM-PC running
MSDOS 6.22 and up, then I will support it for that platform.  If
someone can't get it to work on a non-compatible, then perhaps it
is time to upgrade.  I can't see a developer halting
development on a new technology or program just to make it work
on obsolete hardware.  I *CAN* see them trying to make their
programs run on as many machines as possible that stick to SOME
sort of accepted standard, or some minimum requirements.
Sometimes, those minimum requirements mean having a display that
is better than a VT100 terminal.


For example, if I write a new DOS graphical program (VGA
640x480), you can hardly *expect* me to support CGA on a Tandy
non-compatible can you?  Supporting that system will hardly open
my market up, or help me to work towards new technological
changes.  The man-hours wasted in adding support for a legacy
system would be much better spent adding NEW features for NEWer
systems.  Every program has a target platform, as well as MINIMUM
system requirements, and also RECOMMENDED system requirements.
People who don't have machines that meet the given requirements,
and DO require using the software will HAVE to upgrade, or else
NOT use the software.  This is the way it works in the real
world.  I personally have HAD to upgrade things in order to run
cutting edge programs, or use new hardware.  I may not have liked
spending the money to upgrade, but I certainly knew that my
computer had LIMITS, and in order to get around these limits an
upgrade was in order. (ie: PCI cards, VLB cards, video
accelerators, etc...)

The whole point of this posting is to try and make people who are
using older equipment understand WHY new programs may NOT work on
their computer, and WHY a lot of people WANT new features that
might not work on older computers.

I think I've pretty much made my point now.  If anyone cares to
send me feedback, I welcome it openly.  I'd sure like to hear
other people's opinions on the subject matter, and see what the
'general consensus' is.

flames > /dev/null  


Mike A. Harris        |             http://blackwidow.saultc.on.ca/~mharris
Computer Consultant   |    My webpage has moved and my address has changed.
My dynamic address: http://blackwidow.saultc.on.ca/~mharris/ip-address.html
mailto:mharris AT blackwidow DOT saultc DOT on DOT ca

LINUX: Lost access to your keyboard after a game?  Email me for fix.
