  • only in /barrelfish-2018-10-04/usr/eclipseclp/documents/search/

Lines Matching defs:of

4 % The contents of this file are subject to the Cisco-style Mozilla Public
6 % in compliance with the License. You may obtain a copy of the License
15 % The Initial Developer of the Original Code is Cisco Systems, Inc.
34 %% [moved into include file because of latex2html problem]
50 %%This tutorial is an overview of different search methods in Prolog,
61 alternative methods of searching for solutions in the presence of
63 We assume we have the standard pattern of a constraint program:
70 The model part contains the logical {\em model} of our problem. It defines
72 Every variable has a {\em domain} of values that it can take
73 (in this context, we only consider domains with a finite number of values).
76 Search is necessary since generally the implementation of the constraints
82 \item If a variable is given a value (from its domain, of course),
87 \item The {\em search space} is the set of all possible total assignments.
97 \subsection{Overview of Search Methods}
104 \caption{A search space of size 16}
108 possible total assignments, some of which are solutions.
120 assignments which represent subsets of the search space) or by moving
126 Here is a table of a selection of search methods together with their properties:
162 As opposed to that, figure \ref{figmovesearch} shows an example of an
184 typical sizes of search spaces.
185 A complete exploration of a huge search space
188 whole classes of similar assignments.
189 Moreover, a complete search method must remember which parts of the
192 acceptable memory requirements if there is a simple structuring of the
193 space that allows compact encoding of sub-spaces.
230 \item {\bf Prove} that certain areas of the space contain no solutions.
231 This can be done with the help of constraints. This is often referred
233 \item {\bf Ignore} parts of the search space that are unlikely to contain
236 A heuristic is a particular traversal order of the search space
241 degrees of freedom that are available for heuristics within the framework of
246 which in {\eclipse} can be implemented using the facilities of the repair-library.
253 There is one form of tree search which is especially economical:
256 of maximum depth N for bookkeeping. Most other strategies of tree
259 of {\eclipse}. Note that the main disadvantage of the depth-first
260 strategy (the danger of going down an infinite branch) does not come
264 This is only justified when the shape of the search tree is statically fixed.
265 Our case is different: we have the freedom of deciding on the shape of every
267 allow us to arrange for {\em any} order of visiting the leaves of the search tree,
275 In general, the nodes of a search tree represent {\em choices}.
281 of simpler sub-problems.
283 In the case of finite-domain problems, the most common form of choice
289 branch of the search tree and to 1 in the other.
295 Other forms of choices are possible. If X2 is a variable that can take
330 \caption{The effect of variable selection}
335 If we decide to choose values for X1 first (at the root of the search tree)
338 this corresponds to one particular order of visiting the leaves of the tree:
342 consequently the order of visiting the leaves is different:
346 this number grows exponentially with the number of variables. For 5
350 Note that the example shows something else: If the domains of the variables
351 are different, then the variable selection can change the number of internal
352 nodes in the tree (but not the number of leaves). To keep the number of nodes
361 the child nodes of a node by choosing the
362 values from the domain of a variable in a particular order.
363 Figure \ref{figvalsel} shows how this can change the order of visiting the
371 \caption{The effect of value selection}
375 By combining variable and value selection, a large number of different
377 To give an idea of the numbers involved, the following table shows the search
378 space sizes, the number of possible search space traversal orderings,
379 and the number of orderings
447 take (this is the definition of indomain/1).
456 factor at the root of the search tree and the total number of internal nodes.
474 Even different instances of the same problem can exhibit widely different
478 Chess players know that pieces in the middle of the board are more
480 placing queens in the middle of the board to reduce the number of
490 The implementation of middle\_first/2 requires a bit of list manipulation
517 the number of backtracks goes down to zero,
523 i.e.\ setting the variables to values in the middle of their domain.
524 We therefore invent a variant of indomain/1 called middle\_first\_indomain/1
534 The implementation of middle\_first\_indomain/2
541 dom(X, List), % the list of values in X's domain
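The middle\_first/2 reordering described in these fragments is implemented in ECLiPSe in the tutorial. As a language-neutral illustration (a Python sketch with hypothetical names, not the tutorial's code), putting the middle elements of a list first amounts to sorting the indices by their distance from the centre:

```python
# Sketch of the middle-first reordering described in the text.
# Hypothetical Python rendering; the tutorial's middle_first/2 is
# written in ECLiPSe Prolog using list manipulation.

def middle_first(xs):
    """Reorder a list so that elements closest to the middle come first,
    ties broken left-to-right (one possible convention)."""
    n = len(xs)
    centre = (n - 1) / 2
    order = sorted(range(n), key=lambda i: abs(i - centre))
    return [xs[i] for i in order]
```

For a 5-column board, for instance, `middle_first([1,2,3,4,5])` yields `[3,2,4,1,5]`, so the middle column is tried first.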
547 However, when looking at a number of different instances of the problem,
557 %The size of the (remaining) search space can be computed easily
559 %sizes of all the (remaining) variable's domains:
568 An interesting piece of information during program development is the
569 number of backtracks. It is a good measure for the quality of
603 Note that there are other possible ways of defining the number of backtracks.
627 One way to limit the scope of backtrack search is to keep a
628 record of the number of backtracks, and curtail the search when this
631 You can find an implementation of Bounded Backtrack Search ({\em BBS})
632 in the file {\tt lds.ecl} in the {\tt doc/examples} directory of your
635 \verb0bounded_backtrack_search0, and it takes two arguments, a list of
636 variables and an integer (the limit on the number of backtracks).
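The fragments above describe bounded backtrack search: keep a count of backtracks and curtail the search when a given limit is exceeded. The tutorial's version is the ECLiPSe predicate in {\tt lds.ecl}; the following Python sketch (hypothetical names, naive chronological backtracking, and one possible way of counting backtracks) only illustrates the idea:

```python
# Sketch of bounded backtrack search (BBS) as described in the text:
# plain chronological backtracking that counts backtracks and gives up
# once the limit is exceeded.  Hypothetical Python names; the
# tutorial's implementation is an ECLiPSe predicate in lds.ecl.

def bounded_backtrack_search(domains, consistent, limit):
    """Return the first total assignment (index -> value) accepted by
    `consistent`, or None if none is found within `limit` backtracks."""
    n = len(domains)
    backtracks = 0

    def label(i, assignment):
        nonlocal backtracks
        if i == n:
            return dict(assignment)          # all variables labelled
        for value in domains[i]:
            assignment[i] = value
            if consistent(assignment):
                result = label(i + 1, assignment)
                if result is not None:
                    return result
            # undo the choice; here every undone choice counts as one
            # backtrack (other definitions are possible, as noted later)
            del assignment[i]
            backtracks += 1
            if backtracks > limit:
                return None                  # limit exceeded: give up
        return None

    return label(0, {})
```

With an all-different check over three variables with domain {0,1,2}, a limit of 10 backtracks suffices to find the assignment 0,1,2, while a limit of 1 makes the search give up.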
664 facilities of {\eclipse}, including {\em non-logical variables} and
696 Credit search is a tree search method where the number of
698 starting the search at the tree root with a certain integral amount of
700 between their child nodes, and so on. A single unit of credit cannot
708 implements credit search. For ease of understanding, it is
734 leaf of the search tree is reached. The leftmost alternative should
737 What is a reasonable amount of credit to give to a search?
739 number of leaf nodes that will be reached.
740 The number of leaf nodes grows exponentially with the number of
742 polynomial runtimes. A good rule of thumb could therefore be to
743 use as credit the number of variables squared or cubed, thus enforcing
747 root of the search tree and disallows choices completely below a certain
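The credit search fragments can be summarised in a sketch: the root receives a fixed amount of credit, each node distributes its credit among its children, and a child that receives none is not explored. The even split below is an assumption (the fragments only say that credit is distributed and that a single unit cannot be split); the tutorial's version is written in ECLiPSe.

```python
# Sketch of credit search with an (assumed) even credit split.

def credit_search(domains, consistent, credit):
    """Yield consistent total assignments, visiting at most roughly
    `credit` leaves of the labelling tree."""
    n = len(domains)

    def label(i, assignment, credit):
        if i == n:
            yield dict(assignment)
            return
        values = list(domains[i])
        base, extra = divmod(credit, len(values))
        for k, value in enumerate(values):
            share = base + (1 if k < extra else 0)
            if share == 0:
                break        # a single unit of credit cannot be split
            assignment[i] = value
            if consistent(assignment):
                yield from label(i + 1, assignment, share)
            del assignment[i]

    yield from label(0, {}, credit)
```

With three 0/1 variables, credit 8 reaches all 8 leaves; credit 1 commits to the leftmost alternative at every node and reaches exactly one leaf, matching the description of exhausted credit degenerating into a single dive.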
757 Another form of incomplete tree search is simply to use time-outs.
760 found so far is returned instead of the proven optimum.
795 heuristic would, of course, not require any search. However, most
798 ``discrepancy'' is a measure of the degree to which it fails to follow
799 the heuristic. LDS starts searching with a discrepancy of $0$ (which
823 assignment to the problem variables specified in advance of the
824 search. The predicate supporting static LDS takes a list of variables
825 (those which are to be labelled) and a list of values (one heuristic
830 The measure of discrepancy, in this case, is simply the number of
832 discrepancy is just the number of variables to be
836 {\tt lds.ecl} in the {\tt doc/examples} directory of your
844 \item[Vars] the list of problem variables. Some of the
847 \item[Vals] the list of values according to the heuristic. It
851 \item[Discrepancy] the discrepancy of the solution returned. Typically this
852 is an output of the search (an integer between $0$ and the number of
905 variable is instantiated as a consequence of constraint propagation
911 Each variable in the list of variables {\em Vars}
920 Notice that the first solution has a discrepancy of $0$, because
961 {\eclipse}, so that for each discrepancy level only a limited number of
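The static LDS fragments describe a search that takes one heuristic value per variable and returns solutions in order of increasing discrepancy (the number of variables whose value deviates from the heuristic). A Python sketch of that idea, with hypothetical names and a deliberately naive level-by-level enumeration (the tutorial's version in {\tt lds.ecl} is ECLiPSe code):

```python
from itertools import combinations

# Sketch of static limited discrepancy search (LDS): solutions are
# produced discrepancy level by discrepancy level.

def static_lds(domains, heuristic, consistent):
    """Yield consistent total assignments, discrepancy 0 first."""
    n = len(domains)

    def label(i, wrong, assignment):
        if i == n:
            yield dict(assignment)
            return
        if i in wrong:       # this variable must deviate from the heuristic
            values = [v for v in domains[i] if v != heuristic[i]]
        else:                # this variable must follow the heuristic
            values = [heuristic[i]]
        for v in values:
            assignment[i] = v
            if consistent(assignment):
                yield from label(i + 1, wrong, assignment)
            del assignment[i]

    for d in range(n + 1):                       # discrepancy level
        for wrong in combinations(range(n), d):  # which variables deviate
            yield from label(0, set(wrong), {})
```

If the pure heuristic assignment is consistent, it comes out first with discrepancy 0, exactly as the text describes for the first solution.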
1004 In the following we discuss several examples of move-based (as opposed
1006 for unconstrained problems, but they work for certain classes of
1009 From a technical point of view, the main difference between tree search
1013 well with the idea of constraint propagation.
1017 We therefore need implementations of the constraints that monitor changes
1024 in the file {\tt knapsack\_ls.ecl} in the {\tt doc/examples} directory of your
1032 knapsack problem. The problem is the following: given a container of
1033 a given capacity and a set of items with given weights and profit
1036 the sum of their profits is maximal.
1052 At the end of the problem modelling code, a standard branch-and-bound tree search
1053 (min\_max) is invoked in the last line of the code.
1056 \item {\tt N} - the number of items (integer)
1057 \item {\tt Profits} - a list of N integers (profit per item)
1058 \item {\tt Weights} - a list of N integers (weight per item)
1059 \item {\tt Capacity} - the capacity of the knapsack (integer)
1064 At the end, we invoke a local search routine instead of tree search:
1089 delivered as the tentative value of Result. Any change of tentative
1090 value inside the ArithExpression leads to an update of the Result.
1091 In the example, the computation of the objective function has
1138 As a simple example of local search, let us look at a random walk
1139 strategy. The idea is to start from a random tentative assignment of
1143 capacity left. We do a fixed number (MaxIter) of such steps and keep
1144 track of the best solution encountered.
1146 Each step consists of
1148 \item Changing the tentative value of some variable, which in turn causes
1149 the automatic recomputation of the conflict constraint set
1156 up as explained above. The violation of the capacity constraint
1161 of the Profit variable (which is being constantly updated by tent\_is).
1187 of all variables in the array randomly to 0 or 1:
1189 a tentative value of 0 to 1, or vice versa.
1190 Note that we are using an array, rather than a list of variables, to
1193 in the file {\tt knapsack\_ls.ecl} in the {\tt doc/examples} directory of your
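The random-walk fragments can be illustrated with a small Python sketch: start from a random 0/1 tentative assignment, repeatedly flip an item in or out, and keep track of the best feasible profit seen. The exact move policy below is an assumption loosely following the description; the tutorial's version in {\tt knapsack\_ls.ecl} works on tentative values with repair constraints in ECLiPSe.

```python
import random

# Random-walk sketch for the knapsack problem (hypothetical names).

def random_walk_knapsack(weights, profits, capacity, max_iter, seed=0):
    rng = random.Random(seed)          # fixed seed for reproducibility
    n = len(weights)
    x = [rng.randint(0, 1) for _ in range(n)]
    best = -1                          # best feasible profit so far
    for _ in range(max_iter):
        weight = sum(w for w, xi in zip(weights, x) if xi)
        if weight <= capacity:
            profit = sum(p for p, xi in zip(profits, x) if xi)
            best = max(best, profit)
            zeros = [i for i in range(n) if x[i] == 0]
            if zeros:                  # try to pack one more random item
                x[rng.choice(zeros)] = 1
        else:                          # overfull: drop a random item
            ones = [i for i in range(n) if x[i] == 1]
            x[rng.choice(ones)] = 0
    return best
```

Only feasible states are scored, so the result is always the profit of some assignment that respects the capacity constraint.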
1201 The following hill-climbing implementation is an instance of the nested
1241 from the array of variables and changes its tentative value from 0 to 1
1254 are within the condition part of the if-then-else construct.
1259 value of the Profit-variable before and after the move is done.
1260 Remember that, since the move operator changes the tentative values of
1272 Simulated Annealing is a slightly more complex variant of local search.
1279 done without improvement of the objective.
1325 Another variant of local search is tabu search. Here, a number of moves
1330 concrete instances of this basic idea. For example, how a move would be
1335 the parameter TabuSize. The local moves consist of either adding
1376 search algorithm is built. An example of this is applying tabu search to
1391 Weak-commitment Search (WCS) can be seen as one of the simplest forms of
1398 assignments to the variables of the problem. Labelling of the variables is
1406 combination of values leads to a dead-end, and will not be tried again in
1410 `weak-commitment' refers to this feature of the search technique, wherein
1413 be `stuck' exhaustively exploring a particular region of a search-space
1417 The min-conflict heuristic of Minton et al. is the heuristic used to
1419 a value to a variable, in which all the values in the remaining domain of the
1423 values of the still unlabelled variables.
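The min-conflict value choice described here can be sketched in Python (hypothetical names; the tutorial does this in ECLiPSe by collecting the violated constraints and counting them with length/2): try every value in the variable's domain and pick the one that violates the fewest constraints, judging the other variables by their current tentative values.

```python
# Sketch of the min-conflict heuristic of Minton et al. as used by WCS.

def min_conflict_value(var, domain, tentative, constraints):
    """Return the value for `var` with the fewest constraint violations;
    ties are broken in favour of the first such value in the domain."""
    def violations(value):
        trial = dict(tentative)      # other variables keep their
        trial[var] = value           # tentative values
        return sum(1 for c in constraints if not c(trial))
    return min(domain, key=violations)
```

For example, with constraints x ≠ y and x ≠ z and tentative values y = 2, z = 3, the value 1 is chosen for x because it causes no conflicts.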
1429 in the file {\tt wcs.ecl} in the {\tt doc/examples} directory of your
1440 where \verb'Vars' is a list of variables that are to be labelled by the
1441 search, and \verb'Initial' is a list of the initial tentative values that
1447 Two example usages of the search are given: 1) a search for potentially all
1450 in the form of Prolog facts.
1455 additional constraints on the placement of the queens have to be specified
1459 tentative and actual values of the variables. Tentative values are
1468 \subsubsection{Implementation of WCS}
1472 implementation in {\eclipse} of the basic algorithm presented by Yokoo.
1482 tentative values of the variables. This is done via the \verb'r_prop'
1484 \item Setting the tentative value of a variable, either initially, or
1486 \item Counting the number of constraint violations on the
1492 The predicate also has to implement the probing and restart steps of the
1495 value which leads to the least number of conflicts (constraint violations)
1507 The above tries out the available values of \verb'Var', collects the
1510 of such constraints using \verb'length/2'. The value with the minimum
1511 number of constraint violations is selected as the binding to \verb'Var' by
1519 specifically creating a choice-point at the start of the search, in the
1541 clause of \verb'do_search/2' will be tried, in which a new search is
1544 have been generated. To allow the search to find more solutions, this
1545 solution is remembered as a nogood in the first clause of
1548 The main difficulty with implementing restart is to remember the values of
1549 labelled variables so that they can be added as a nogood. The addition of the
1558 non-logical variable feature of {\eclipse}, which allows copies of
1568 value of this variable can then be set via \verb'setval/2' and accessed
1601 not, \verb'label(Var)' fails, and the else case of the if-then-else is
1636 cannot take on the values of \verb'1', \verb'2' and \verb'3' respectively
1638 and take on the values \verb'[1,2,3]'. If any of the variables take on a
1658 The next variable to be labelled is chosen from the set of variables whose
1661 causing conflict will occur in these constraints. The set of conflicting
1667 valid variable to be labelled, i.e.\ that it is one of the variables to be
1669 routine will be used as part of a larger program, and the program may use
1674 \subsubsection{Improving the Implementation of Nogoods}
1684 value can be propagated. This increases the efficiency of the search in many
1685 cases, although at the price of a slightly more complex implementation
1686 of the nogood constraint.
1714 at least one of them gets instantiated.
1716 Obviously, this is still a relatively naive implementation of the
1717 nogood-technique. As the number of nogoods grows, implementing them
1719 optimisation techniques like merging and indexing of the nogoods
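The naive nogood handling discussed in these fragments can be sketched as follows (hypothetical Python names; the tutorial stores nogoods as facts and checks them with constraints in ECLiPSe): a nogood is a partial assignment already known to lead to failure, and a candidate assignment is rejected as soon as it contains a recorded nogood. This is the naive scheme the text mentions, before optimisations such as merging and indexing.

```python
# Sketch of naive nogood recording and checking.

def violates_nogood(assignment, nogoods):
    """True iff `assignment` extends one of the recorded nogoods."""
    return any(all(assignment.get(var) == val for var, val in ng.items())
               for ng in nogoods)

def record_nogood(assignment, nogoods):
    """Remember a failed assignment so it is never tried again."""
    nogoods.append(dict(assignment))
```

Note that a nogood only fires once all of its variables carry the recorded values; a partial assignment that leaves one of them unset is not rejected.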