\chapter{Future Work}
\label{chap:future_work}
\epigraph{Follow!  But! follow only if ye be men of valor, for the entrance
  to this cave is guarded by a creature so foul, so cruel that no man
  yet has fought with it and lived!  Bones of full fifty men lie strewn
  about its lair. So, brave knights, if you do doubt your courage or
  your strength, come no further, for death awaits you all with nasty
  big pointy teeth.}{Monty Python}

This chapter is going to look like a brain dump, despite my best
efforts to make it understandable by the Outside World.

\begin{description}
\item[Module import clean-up:] for historical reasons, some imports
  may now be completely useless. Similarly, debugging imports such as
  |Debug.Trace| should disappear too;
\item[Paka terms with real holes:] in Section~\ref{sec:il_paka_paka},
  we saw that Paka terms ignore most of their holes, relying on
  hard-coded values instead;
\item[More efficient redundant assignment optimizer:] in
  Section~\ref{sec:il_paka_optimizer}, we saw that the optimizer is so
  conservative as to be nearly useless in practice;
\item[Supporting function pointers:] preventing Filet-o-Fish users
  from using function pointers is a violation of the Geneva
  Convention. I do not think that there is any deep technical
  difficulty in supporting them, although, if I remember correctly,
  printing the type of such a pointer was a first stumbling block;
\item[Implementing the interpreter in the Agda language:] this was
  already one of my initial goals, but the NICTA people insisted that,
  without an in-theorem-prover semantics, the dependability argument
  is just bullsh*t. Ha, these Australians\ldots;
\item[Code generator back-back-end:] following in the footsteps of FoF
  and Paka, we need a more principled back-back-end, generating
  (correct) C code out of |FoFCode|;
\item[Hoopl-based optimization framework:] the
  Hoopl~\cite{ramsey-hoopl} framework is a promising tool for
  implementing any kind of data-flow analysis and optimization.
  Instead of developing our own crappy optimizer, we should build upon
  it once its source is released. This is the reason why
  @IL.Paka.Optimizer@ is such a joke: it \emph{must} be dropped as
  soon as possible;
\item[Translation validation infrastructure:] because we claim
  dependability while our compiler is such a tricky mess, we need a
  good bodyguard. Translation validation~\cite{necula-tvi} is an
  affordable technique that tells you, each time you run your
  compiler, whether it has barfed (and where). If it has not failed,
  then you know for sure that the generated code is correct;
\item[More stringent syntactic tests:] it is very easy to build
  ill-formed Filet-o-Fish terms: the types of the constructs have not
  been engineered to enforce their invariants, and there are few or no
  run-time checks. Fixing this is just a matter of adding more
  run-time checks, a lot more;
\item[Compiling to macros:] this is an interesting topic: we are able
  to generate C code, but we might need to generate C macros at some
  point. How would that fit into Filet-o-Fish?
\item[Compiling with assertions:] assuming that Filet-o-Fish-generated
  C code is correct, we are assured that it never fails at run-time,
  unless it is provided with bogus input data. Being able to specify
  what valid input data is, and to translate that specification into
  assertions, might therefore be useful. Similarly, when reading from
  an array, we probably want to ensure that we do not go out of
  bounds, with an assert failing if we do.
\end{description}
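A first step toward a less conservative optimizer, pending the
Hoopl-based rewrite, could be a straight-line pass that drops an
assignment whose target is overwritten before being read. The |Stmt|
type below is a deliberately simplified stand-in, not the actual
@IL.Paka@ representation:

```haskell
-- An assignment to a target variable; we record only which variables
-- its right-hand side reads.  (Hypothetical, simplified statement type.)
data Stmt = Assign String [String] deriving (Eq, Show)

-- Drop an assignment whose target is overwritten by the following
-- statement without having been read in between.  A real pass would
-- work over a control-flow graph; this handles straight-line code only.
dropRedundant :: [Stmt] -> [Stmt]
dropRedundant (Assign x _ : rest@(Assign y uses : _))
  | x == y && x `notElem` uses = dropRedundant rest
dropRedundant (s : ss) = s : dropRedundant ss
dropRedundant []       = []
```

Note that the guard keeps an assignment such as |a := f a| alive: the
second write reads its own target, so the first write is not redundant.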
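Concerning function pointers, the stumbling block is that C
declarators wrap around the declared name, so a function-pointer type
cannot be printed as ``type, then name''. A minimal sketch, with a
made-up |CType| that is not the actual Filet-o-Fish type
representation:

```haskell
import Data.Char (isSpace)
import Data.List (intercalate, dropWhileEnd)

-- Hypothetical, cut-down C type representation.
data CType = TInt | TChar | TPtr CType | TFun CType [CType]

-- | Render the C declaration of a name at a given type.  The name is
-- threaded *into* the declarator, which is what makes function
-- pointers such as `int (*f)(char)` awkward to print naively.
declare :: CType -> String -> String
declare TInt          n = "int " ++ n
declare TChar         n = "char " ++ n
declare (TPtr t)      n = declare t ("(*" ++ n ++ ")")
declare (TFun r args) n =
  declare r (n ++ "(" ++ intercalate ", " (map abstract args) ++ ")")
  where abstract t = dropWhileEnd isSpace (declare t "")

main :: IO ()
main = putStrLn (declare (TPtr (TFun TInt [TChar])) "callback")
-- prints: int (*callback)(char)
```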
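As for run-time checks, one option is to route term construction
through smart constructors that validate invariants and report
ill-formed terms early. A sketch of the idea, with hypothetical
|StructDecl| and |structLit| names rather than the actual Filet-o-Fish
API:

```haskell
import Data.List (sort)

-- Hypothetical declaration of a struct and its field names.
data StructDecl = StructDecl { structName :: String
                             , fieldNames :: [String] }

-- | Smart constructor for a struct literal: accept the initialisers
-- only if they cover every declared field exactly once.
structLit :: StructDecl -> [(String, Int)] -> Either String [(String, Int)]
structLit decl inits
  | sort (map fst inits) == sort (fieldNames decl) = Right inits
  | otherwise = Left ("ill-formed literal for struct " ++ structName decl)
```

Pushing such checks into every constructor would catch a large class
of ill-formed terms at term-construction time rather than at C
compilation time.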
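Finally, bounds-checking assertions could be produced by a combinator
wrapping every array read. The string-level |boundedRead| below is
only an illustration of the C text we would want the back-end to emit;
a real implementation would extend the |FoFCode| combinators instead:

```haskell
-- | Render a bounds-checked array read as C statements: an assert
-- guarding the index, followed by the read itself.
boundedRead :: String  -- ^ destination variable
            -> String  -- ^ array name
            -> String  -- ^ index expression
            -> Int     -- ^ static array length
            -> [String]
boundedRead dest arr idx len =
  [ "assert(0 <= " ++ idx ++ " && " ++ idx ++ " < " ++ show len ++ ");"
  , dest ++ " = " ++ arr ++ "[" ++ idx ++ "];"
  ]
```

For example, |boundedRead "x" "buf" "i" 8| yields the two statements
@assert(0 <= i && i < 8);@ and @x = buf[i];@.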