Mail Archives: geda-user/2013/09/01/17:22:50

X-Authentication-Warning: delorie.com: mail set sender to geda-user-bounces using -f
X-Recipient: geda-user AT delorie DOT com
Date: Sun, 1 Sep 2013 14:21:59 -0700
From: Larry Doolittle <ldoolitt AT recycle DOT lbl DOT gov>
To: geda-user AT delorie DOT com
Subject: Re: [geda-user] VE
Message-ID: <20130901212159.GA19934@recycle.lbl.gov>
References: <CAGRhJMag+hNGutkiG6Mgr9+vxaSynkNEajR0HE2gs8GeZM0o0Q AT mail DOT gmail DOT com>
<20130901043811 DOT GA18909 AT recycle DOT lbl DOT gov>
<1E387459-44FD-4E10-96EB-1D0E787B94D6 AT noqsi DOT com>
MIME-Version: 1.0
In-Reply-To: <1E387459-44FD-4E10-96EB-1D0E787B94D6@noqsi.com>
User-Agent: Mutt/1.5.20 (2009-06-14)
Reply-To: geda-user AT delorie DOT com
Errors-To: nobody AT delorie DOT com
X-Mailing-List: geda-user AT delorie DOT com
X-Unsubscribes-To: listserv AT delorie DOT com

Guys -

On Sun, Sep 01, 2013 at 12:47:54PM -0600, John Doty wrote:
> On Aug 31, 2013, at 10:38 PM, Larry Doolittle wrote:
> > I'm very interested in being
> > able to write generic numeric code, have it simulate (at first)
> > at "infinite" precision, then establish real-life bounds and precision
> > needs based on SNR goals, resulting in concrete scaled-fixed-point
> > variables.  That is well beyond existing language capabilities.
> Well, you can't really simulate at infinite precision. However, you *can* do algebraic circuit analysis at infinite precision. That's what gnetlist -g mathematica is for.

Double-precision floating-point counts as "infinite" precision in
my world.  But I want to see simulations at that abstraction level
work (e.g., pass regression tests) before I (or Free Software under
my control) decide that variable x can be represented as an 18-bit
word with the binary point between bits 14 and 15, without degrading
the S/N of result y by more than 0.2 dB.
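For concreteness, here's a rough Octave sketch (made-up numbers, not
production code) of the kind of check I mean -- quantize x to that
18-bit format and see how much S/N the quantization alone costs:

  nbits = 18;                              % total word length
  frac  = 15;                              % binary point between bits 14 and 15
  x  = 0.9*sin(2*pi*0.01*(0:4095));        % stand-in test vector
  xq = round(x*2^frac);                    % scale and round onto the integer grid
  xq = max(min(xq, 2^(nbits-1)-1), -2^(nbits-1));  % saturate to the 18-bit range
  xq = xq/2^frac;                          % back to real units
  err   = x - xq;
  snr_q = 10*log10(sum(x.^2)/sum(err.^2)); % quantization SNR, dB
  printf("quantization SNR = %.1f dB\n", snr_q);

The real test is the same comparison applied to the downstream result
y (quantized pipeline vs. double-precision reference), checked against
the 0.2 dB budget in a regression suite.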

Without any such fancy software to help, I do what I think is
typical in industry:  prototype my DSP in Octave, transcribe it to
Verilog (estimating the required scaling and precision by hand), and
then start debugging all over again.
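In Octave the prototype stage is a couple of lines of double-precision
arithmetic; the hand transcription is all about replacing that with
fixed word widths.  A toy example (not from a real design):

  x = 0.9*sin(2*pi*0.01*(0:4095));  % stand-in input, double precision
  a = 0.0312;                       % pole coefficient; becomes ~2^-5 in hardware
  y = filter(a, [1, -(1-a)], x);    % one-pole lowpass, y[n] = a*x[n] + (1-a)*y[n-1]
  % Transcribing this to Verilog means picking word widths and binary
  % points for x, y, and the accumulator by hand, then re-measuring
  % the S/N of y against this double-precision reference.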

   - Larry
