Actually, the big win was tokenizing strings - parsing variable-length data and allocating temporary buffers was a killer. The binary format stored the size of each array and string before its data, so everything could be pre-allocated and read in chunks. Tokenizing strings really sped up reading the name portion of name=value pairs, since the same names were repeated so many times.
Copyright © 2019 by DJ Delorie | Updated Jul 2019 |