Mail Archives: djgpp/1998/01/23/19:15:26

From: punjab AT ix DOT netcom DOT com (//)
Newsgroups: comp.os.msdos.djgpp
Subject: Re: The meaning of O(n)...
Date: 23 Jan 1998 22:34:38 GMT
Organization: Netcom
Lines: 62
Message-ID: <6ab5tu$cqb@dfw-ixnews11.ix.netcom.com>
References: <Pine DOT GSO DOT 3 DOT 96 DOT 980120190605 DOT 19121B-100000 AT bert DOT eecs DOT uic DOT edu> <34C8B7D1 DOT A19B36EA AT pentek DOT com>
NNTP-Posting-Host: stk-ca1-22.ix.netcom.com
To: djgpp AT delorie DOT com
DJ-Gateway: from newsgroup comp.os.msdos.djgpp

In <34C8B7D1 DOT A19B36EA AT pentek DOT com> Charles Krug <charles AT pentek DOT com>
writes: 
>
>Andrew Deren wrote:
>
>> On 20 Jan 1998, Gili wrote:
>>
>> > Hi,
>> >
>> > 	I have noticed that lots of algorithms give their efficiency
>> > as O(n), where n is the number of elements they are operating on.
>> > The last math course I have taken is Calculus 2 and I do not recall
>> > ever seeing the O(n) function. What does it mean? How efficient is
>> > something with O(n)? Thanks in advance,
>>
>> O(n) is what computer scientists use to describe the running time of
>> an algorithm.
>
>It's actually from discrete math, and is usually called the
>"complexity" of an algorithm.  There is quite a bit of math behind it,
>but what it really means is this.
>
>Suppose that you have an algorithm that takes some function of n to
>execute, where n is the number of things being operated on.  Take, for
>example, this function:
>
>        n^2 + n log(n) + 3000
>
>If n is allowed to grow very large (how large depends upon the
>constants), one term will dominate.  That term is used to describe the
>"complexity" of an algorithm.  They are customarily listed in this
>order, from fastest-growing to slowest:
>
>n!, 2^n, n^3, n^2, n log(n), n, log(n), 1.
>
>If you look at a bubble sort, you realize that you have n outer loops,
>each containing (n-1) inner loops, giving an expanded polynomial of
>(n^2 - n).  This therefore has the complexity n^2.  If you look at
>quicksort, you see log(n) outer loops, each containing (n-1) inner
>loops, giving a complexity of log(n)(n-1) or n log(n) - log(n), which
>has the complexity n log(n).
>
>
>--
>Charles Krug, Jr.
>
>

Also be aware that this describes the asymptotic, large-N behavior of
the algorithm, and the constants are thrown away.  For example, an
O(N^2) algorithm may actually run faster than an O(NlogN) algorithm
when N < N', where N' is the point at which t(alg 1, N') = t(alg 2, N').
