X-Authentication-Warning: delorie.com: mail set sender to geda-user-bounces using -f
Date: Fri, 11 Sep 2015 12:07:45 -0400
Message-Id: <201509111607.t8BG7jio032558@envy.delorie.com>
From: DJ Delorie
To: geda-user AT delorie DOT com
In-reply-to: (message from Kai-Martin Knaak on Fri, 11 Sep 2015 12:10:59 +0200)
Subject: Re: [geda-user] Notice to developers: layers have now a type.
References: <55F1F75F DOT 8010809 AT jump-ing DOT de>
Reply-To: geda-user AT delorie DOT com
Errors-To: nobody AT delorie DOT com
X-Mailing-List: geda-user AT delorie DOT com
X-Unsubscribes-To: listserv AT delorie DOT com
Precedence: bulk

> >     "copper",      /* LT_COPPER */
> >     "silk",        /* LT_SILK */
> (...)
>
> These are exclusive, aren't they?

If you mean you can't use silkscreen dye to make copper traces, yes :-)

The LT_* are layer types: enums, not bitmask flags.

> >     "invisible",   /* LT_INVISIBLE */
>
> What is the use case for invisible?  What process should (not) see
> this layer?

This might be a hold-over from the internal "invisible" type of layer.
For example, if you're looking at the top silk, the bottom silk is
drawn as "invisible", as are bottom elements.  Some of the exporters
need to know this to export the board correctly.

> I imagine there is potential for increased speed if common layer
> selections were calculated and cached in advance.

If you have thousands of layers, sure.  But we currently have at most
16.  Iterating over 16 things is not going to be a bottleneck, and
certainly not worth making the code more complicated (and less
maintainable).  Most of the CPU time goes into moving layer data
between the internal representation and the graphics card.
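
To illustrate the enum-vs-flags point above, here is a minimal sketch.
It is not the actual pcb source; any names beyond the quoted LT_*
values are made up for illustration:

    /* Sketch only: names other than the quoted LT_* values are invented. */
    typedef enum {
      LT_COPPER,      /* each layer has exactly one of these values */
      LT_SILK,
      LT_INVISIBLE
      /* ... */
    } layer_type_t;

    layer_type_t t = LT_SILK;            /* one type per layer */

    /* Bitmask flags would look like this instead, and would allow
       combinations, which is exactly what a layer type is not: */
    #define F_COPPER (1u << 0)
    #define F_SILK   (1u << 1)
    unsigned flags = F_COPPER | F_SILK;  /* legal for flags, meaningless here */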
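
And on the iteration cost, a rough sketch with made-up structures (not
pcb's real internals) of what "iterating over 16 things" amounts to:

    /* Again a sketch; the struct layout is invented for illustration. */
    #include <stddef.h>

    enum layer_type { LT_COPPER, LT_SILK, LT_INVISIBLE /* ... */ };

    struct layer {
      enum layer_type type;
      /* name, geometry, etc. */
    };

    /* Linear scan over the layer stack; with n <= 16 this is noise
       compared to pushing the geometry to the graphics card. */
    static size_t
    count_layers_of_type (const struct layer *layers, size_t n,
                          enum layer_type t)
    {
      size_t hits = 0;
      for (size_t i = 0; i < n; i++)
        if (layers[i].type == t)
          hits++;
      return hits;
    }

Caching the result of something like that would add bookkeeping for no
measurable gain.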