Date: Tue, 07 Jul 2015 16:28:23 -0400
From: "Dave McGuire (mcguire AT neurotica DOT com) [via geda-user AT delorie DOT com]"
To: geda-user AT delorie DOT com
Subject: Re: [geda-user] gEDA/gschem still alive?

On 07/07/2015 02:33 PM, Ivan Stankovic (pokemon AT fly DOT srk DOT fer DOT hr) [via geda-user AT delorie DOT com] wrote:
>> This sets off some alarm bells for me. I'm a professional developer;
>> I write code every day and I like to stay on top of new research, and
>> I've never even heard of most of the languages you mentioned here.
>> I've heard of Go, and Python, Ruby, and Java, of course, but Nim?
>> Crystal? Rust?
>
> I'm also a professional developer and know many other professional
> developers. I also know other developers who are not professional
> (sorry, could not resist). ;)
>
> Just because you haven't heard of them does not mean that the
> rest of the world hasn't.

Well, sure...and I admit that I live squarely in the embedded world now; I usually don't write server code anymore, and the last time I wrote user-facing "app" code was 25 years ago. I wouldn't really have had occasion to bump into one of these languages. But it's not as if I live under a rock; my professional and social circles include programmers of all stripes, and yet these languages still haven't come up on my radar as anything in actual USE anywhere.

>> Locking development into someone's pet language that will likely
>> disappear into (further) obscurity in a year or two is not the way to
>> ensure the longevity of a software project.
>
> I completely agree. Though I have to point out that it all depends
> on the likelihood of "disappearing into obscurity". I predict that
> not all of them will be extinct ten, or twenty years from now.

One can hope, but one never knows. And how does one decide which one to "get behind"? If you choose the wrong one, all the code you've written in it effectively becomes obsolete very quickly. It's not an easy problem to solve.

>> And further (and I apologize if it sounds like I'm picking on you
>> here), rabid proponents of dozens of "pet" programming languages have
>> claimed them to be "as fast as C!!" for decades. I didn't believe it
>> then, and I don't believe it now.
>
> You do not need to believe anyone. Measure. Evaluate. Then draw
> conclusions.

I have, with Perl, Java, and C++. C is consistently the fastest and has the smallest memory footprint.
People often seem to forget that programming in higher-level languages usually means importing huge blocks of code as binary blobs over which they have no control: no say in what's included and what's omitted, or in what's executed and what isn't. When one raises the conceptual level of programming, one (usually) sacrifices flexibility and control, and invariably people explain that away with a hand-wave: "oh, we really didn't want all of that flexibility and control anyway, because it made us make mistakes!" Everybody makes mistakes, creates buffer overruns and bad pointer dereferences and so on...but competent developers make fewer mistakes and introduce fewer bugs. Lowering the barriers to entry creates more programmers...not better ones.

Somewhere along the line, I believe in the early 1970s, some idiot proclaimed that "programmer time is more precious than processor time". This created an excuse for programmer laziness that affects us to this day, giving us such "progress" as operating systems that require gigabytes of RAM and several minutes at billions of clock cycles per second just to boot. Every time Firefox slows to a crawl or Xilinx ISE takes forever to start up, I'm reminded of this. It seems most people just accept it, paint the pretty face of "progress" (at any cost!) on it, and pretend to like it.

When it takes a minute and a half to open a new web browser window on a machine with six 3.2GHz cores, 48MB of cache, 24GB of main memory, and fast disks, I very quickly decide that processor time is a whole lot more precious than programmer time.

Let's not push gEDA down the path to that same frustration just because the language most of it is written in actually requires some skill and thought.

          -Dave

-- 
Dave McGuire, AK4HZ
New Kensington, PA