I already had support for embedding the results of SQL queries as a (formatted) HTML table, transparently handling column sorting and pagination. I wanted to add support for JavaScript-enabled browsers requesting the data via AJAX, so as not to require a full page refresh (this becomes a bigger win when you have more than one such table on a page).
In order to support this I had to generate JSON to represent the table. There are a number of libraries in Quicklisp that can generate JSON for you, but the ones I looked at expect the data for conversion to be provided as lisp data. The natural way of representing the table data is as an array of arrays, but that isn't the natural way to generate it: I stream the data from the database row by row, generating the strings for the cell contents as I go, and I didn't really want to hold on to all of those strings in vectors only to throw them away at the end (the unformatted JSON for one of my test pages comes to 43kb, and the in-memory layout would be even worse).
I opted to do the generation by hand because this way I could stream the output as I streamed the input, but now I would like to reify that code into a library.
Consider the following example of the code I would like to be able to write:
Code: Select all
(json-arr
  (dotimes (i 10)
    (json-val i))
  (json-arr
    (json-val 'hi-there)
    (json-val 20)))
Code: Select all
(defmacro json-arr (&body body)
  (bind ((:symbols seen))
    ;; SEEN names the variable used in the expansion to track whether an
    ;; element has already been written (and so needs a comma before it)
    `(progn
       (write-string "[" *json-stream*)
       (let (,seen)
         (macrolet ((json-arr (&body body)
                      `(progn
                         (if ,',seen
                             (write-string ", " *json-stream*)
                             (setf ,',seen t))
                         ;; meant to be the *global* JSON-ARR, but it is
                         ;; expanded via this MACROLET again
                         (json-arr ,@body)))
                    ;; similar definitions for json-obj and json-val would also go here
                    )
           ,@body))
       (write-string "]" *json-stream*))))
If FLET weren't in Common Lisp you could imitate its effect with two layers of LABELS and a gensym: the outer layer uses the gensym to bind a function that calls the global function, and the inner LABELS redefines the named function, using the gensym to invoke the definition it is rebinding.
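Something like this, with FROB standing in for some global function and OUTER-FROB written out where the gensym would really go:
Code: Select all
(defun frob (x)
  (list :global x))

(labels ((outer-frob (x)
           ;; outer layer: just forwards to the global FROB
           (frob x)))
  (labels ((frob (x)
             ;; inner layer: "redefines" FROB, reaching the shadowed
             ;; definition through OUTER-FROB instead of recursing
             (list :local (outer-frob x))))
    (frob 1)))   ; => (:LOCAL (:GLOBAL 1))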
My first attempt to solve my MACROLET problem (foolishly) imitated the above by substituting a double MACROLET; this resulted in the same infinite recursion, because, unlike function invocation, macroexpansion is always done in the innermost environment, so the "call the global macro" expansion is itself expanded by the local definition again.
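The shape of that attempt was roughly this (with plain GENSYMs rather than the (:symbols ...) binding, and OLD-ARR playing the role of the gensym-named outer function from the LABELS version):
Code: Select all
(defmacro json-arr (&body body)
  (let ((seen (gensym "SEEN"))
        (old-arr (gensym "OLD-ARR")))
    `(progn
       (write-string "[" *json-stream*)
       (let (,seen)
         ;; outer MACROLET: OLD-ARR is supposed to mean "the global JSON-ARR"
         (macrolet ((,old-arr (&body body)
                      `(json-arr ,@body)))
           ;; inner MACROLET: redefine JSON-ARR in terms of OLD-ARR
           (macrolet ((json-arr (&body body)
                        `(progn
                           (if ,',seen
                               (write-string ", " *json-stream*)
                               (setf ,',seen t))
                           (,',old-arr ,@body))))
             ,@body)))
       (write-string "]" *json-stream*))))
The expansion of the OLD-ARR macro is (json-arr ...), and that form is then macroexpanded right where it appears, inside both MACROLETs, so it goes straight back through the local JSON-ARR.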
I see a few ways to proceed:
- Interpose a different symbol (e.g. JSON-EL) to signal where a comma might be needed, so that JSON-ARR doesn't need to be redefined. Con: makes client code more verbose and adds indentation
- Write the macros using symbols that won't be rebound (e.g. %json-arr), and then copy them into the public symbols (via (SETF (MACRO-FUNCTION 'json-arr) ...), or just (DEFMACRO json-arr (&body body) `(%json-arr ,@body))); see the sketch after this list. Con: these other symbols would appear in macro-expansions, and the MACROLET is still a PITA
- Use a special variable to communicate between the macros that they may need to inject a comma (requires running the expanders manually with MACROEXPAND). Doesn't work, because MACROEXPAND doesn't expand subforms, and by the time the subforms do get expanded the special variable is no longer bound.
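For the second option, a minimal sketch of the shape I have in mind, with %JSON-VAL reduced to PRINC and all the quoting/escaping details glossed over:
Code: Select all
(defvar *json-stream* *standard-output*)   ; stand-in; bound elsewhere in the real code

(defmacro %json-val (form)
  `(princ ,form *json-stream*))   ; the real version would quote/escape

;; %JSON-ARR does the real work.  Its expansion rebinds the *public*
;; JSON-ARR and JSON-VAL with MACROLET so that nested elements emit a
;; comma first; those local definitions bottom out in the %-names,
;; which are never rebound, so the expansion terminates.
(defmacro %json-arr (&body body)
  (let ((seen (gensym "SEEN")))
    `(progn
       (write-string "[" *json-stream*)
       (let (,seen)
         (macrolet ((json-val (form)
                      `(progn
                         (if ,',seen
                             (write-string ", " *json-stream*)
                             (setf ,',seen t))
                         (%json-val ,form)))
                    (json-arr (&body body)
                      `(progn
                         (if ,',seen
                             (write-string ", " *json-stream*)
                             (setf ,',seen t))
                         (%json-arr ,@body))))
           ,@body))
       (write-string "]" *json-stream*))))

;; the public names just forward to the %-names
(defmacro json-arr (&body body) `(%json-arr ,@body))
(defmacro json-val (form) `(%json-val ,form))
Client code still writes JSON-ARR and JSON-VAL as in the example at the top; the %-names only appear if you look at the macro-expansion.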