reorg.c revision 18334
/* Perform instruction reorganizations for delay slot filling.
   Copyright (C) 1992, 1993, 1994, 1995 Free Software Foundation, Inc.
   Contributed by Richard Kenner (kenner@vlsi1.ultra.nyu.edu).
   Hacked by Michael Tiemann (tiemann@cygnus.com).

This file is part of GNU CC.

GNU CC is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation; either version 2, or (at your option)
any later version.

GNU CC is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License
along with GNU CC; see the file COPYING.  If not, write to
the Free Software Foundation, 59 Temple Place - Suite 330,
Boston, MA 02111-1307, USA.  */

/* Instruction reorganization pass.

   This pass runs after register allocation and final jump
   optimization.  It should be the last pass to run before peephole.
   It serves primarily to fill delay slots of insns, typically branch
   and call insns.  Other insns typically involve more complicated
   interactions of data dependencies and resource constraints, and
   are better handled by scheduling before register allocation (by the
   function `schedule_insns').

   The Branch Penalty is the number of extra cycles that are needed to
   execute a branch insn.  On an ideal machine, branches take a single
   cycle, and the Branch Penalty is 0.  Several RISC machines approach
   branch delays differently:

   The MIPS and AMD 29000 have a single branch delay slot.  Most insns
   (except other branches) can be used to fill this slot.  When the
   slot is filled, two insns execute in two cycles, reducing the
   branch penalty to zero.

   The Motorola 88000 conditionally exposes its branch delay slot,
   so code is shorter when it is turned off, but will run faster
   when useful insns are scheduled there.

   The IBM ROMP has two forms of branch and call insns, both with and
   without a delay slot.  Much like the 88k, insns not using the delay
   slot can be shorter (2 bytes vs. 4 bytes), but will run slower.

   The SPARC always has a branch delay slot, but its effects can be
   annulled when the branch is not taken.  This means that failing to
   find other sources of insns, we can hoist an insn from the branch
   target that would only be safe to execute knowing that the branch
   is taken.

   The HP-PA always has a branch delay slot.  For unconditional branches
   its effects can be annulled when the branch is taken.  The effects
   of the delay slot in a conditional branch can be nullified for forward
   taken branches, or for untaken backward branches.
   This means we can hoist insns from the fall-through path for forward
   branches or steal insns from the target of backward branches.

   Three techniques for filling delay slots have been implemented so far:

   (1) `fill_simple_delay_slots' is the simplest, most efficient way
   to fill delay slots.  This pass first looks for insns which come
   from before the branch and which are safe to execute after the
   branch.  Then it searches after the insn requiring delay slots or,
   in the case of a branch, for insns that are after the point at
   which the branch merges into the fallthrough code, if such a point
   exists.  When such insns are found, the branch penalty decreases
   and no code expansion takes place.

   (2) `fill_eager_delay_slots' is more complicated: it is used for
   scheduling conditional jumps, or for scheduling jumps which cannot
   be filled using (1).  A machine need not have annulled jumps to use
   this strategy, but it helps (by keeping more options open).
   `fill_eager_delay_slots' tries to guess the direction the branch
   will go; if it guesses right 100% of the time, it can reduce the
   branch penalty as much as `fill_simple_delay_slots' does.  If it
   guesses wrong 100% of the time, it might as well schedule nops (or
   on the m88k, unexpose the branch slot).  When
   `fill_eager_delay_slots' takes insns from the fall-through path of
   the jump, usually there is no code expansion; when it takes insns
   from the branch target, there is code expansion if it is not the
   only way to reach that target.

   (3) `relax_delay_slots' uses a set of rules to simplify code that
   has been reorganized by (1) and (2).  It finds cases where a
   conditional test can be eliminated, jumps can be threaded, extra
   insns can be eliminated, etc.  It is the job of (1) and (2) to do a
   good job of scheduling locally; `relax_delay_slots' takes care of
   making the various individual schedules work well together.  It is
   especially tuned to handle the control flow interactions of branch
   insns.  It does nothing for insns with delay slots that do not
   branch.

   On machines that use CC0, we are very conservative.  We will not make
   a copy of an insn involving CC0 since we want to maintain a 1-1
   correspondence between the insn that sets CC0 and the insn that uses
   it.  The insns are allowed to be separated by placing an insn that
   sets CC0 (but not an insn that uses CC0; we could do this, but it
   doesn't seem worthwhile) in a delay slot.  In that case, we point each
   insn at the other with REG_CC_USER and REG_CC_SETTER notes.  Note that
   these restrictions affect very few machines because most RISC machines
   with delay slots will not use CC0 (the RT is the only known exception
   at this point).
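
   As a purely illustrative sketch (in a made-up assembly syntax, not any
   particular target's), technique (1) above typically rewrites

	(before)		(after)
	add	r1,r2,r3	b,d	L1
	b	L1		add	r1,r2,r3
	nop

   where the `add' from before the branch is safe to execute in the
   branch's delay slot, so the nop disappears and the branch latency
   is hidden.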

   Not yet implemented:

   The Acorn Risc Machine can conditionally execute most insns, so
   it is profitable to move single insns into a position to execute
   based on the condition code of the previous insn.

   The HP-PA can conditionally nullify insns, providing a similar
   effect to the ARM, differing mostly in which insn is "in charge".  */

#include <stdio.h>
#include "config.h"
#include "rtl.h"
#include "insn-config.h"
#include "conditions.h"
#include "hard-reg-set.h"
#include "basic-block.h"
#include "regs.h"
#include "insn-flags.h"
#include "recog.h"
#include "flags.h"
#include "output.h"
#include "obstack.h"
#include "insn-attr.h"

#ifdef DELAY_SLOTS

#define obstack_chunk_alloc xmalloc
#define obstack_chunk_free free

#ifndef ANNUL_IFTRUE_SLOTS
#define eligible_for_annul_true(INSN, SLOTS, TRIAL, FLAGS) 0
#endif
#ifndef ANNUL_IFFALSE_SLOTS
#define eligible_for_annul_false(INSN, SLOTS, TRIAL, FLAGS) 0
#endif

/* Insns which have delay slots that have not yet been filled.  */

static struct obstack unfilled_slots_obstack;
static rtx *unfilled_firstobj;

/* Define macros to refer to the first and last slot containing unfilled
   insns.  These are used because the list may move and its address
   should be recomputed at each use.  */

#define unfilled_slots_base \
  ((rtx *) obstack_base (&unfilled_slots_obstack))

#define unfilled_slots_next \
  ((rtx *) obstack_next_free (&unfilled_slots_obstack))

/* This structure is used to indicate which hardware resources are set or
   needed by insns so far.  */

struct resources
{
  char memory;		/* Insn sets or needs a memory location.  */
  char unch_memory;	/* Insn sets or needs an "unchanging" MEM.  */
  char volatil;		/* Insn sets or needs a volatile memory loc.  */
  char cc;		/* Insn sets or needs the condition codes.  */
  HARD_REG_SET regs;	/* Which registers are set or needed.  */
};

/* Macro to clear all resources.  */
#define CLEAR_RESOURCE(RES)	\
 do { (RES)->memory = (RES)->unch_memory = (RES)->volatil = (RES)->cc = 0; \
      CLEAR_HARD_REG_SET ((RES)->regs); } while (0)

/* Indicates what resources are required at the beginning of the epilogue.  */
static struct resources start_of_epilogue_needs;

/* Indicates what resources are required at function end.  */
static struct resources end_of_function_needs;

/* Points to the label before the end of the function.  */
static rtx end_of_function_label;

/* This structure is used to record liveness information at the targets or
   fallthrough insns of branches.  We will most likely need the information
   at targets again, so save them in a hash table rather than recomputing them
   each time.  */

struct target_info
{
  int uid;			/* INSN_UID of target.  */
  struct target_info *next;	/* Next info for same hash bucket.  */
  HARD_REG_SET live_regs;	/* Registers live at target.  */
  int block;			/* Basic block number containing target.  */
  int bb_tick;			/* Generation count of basic block info.  */
};

#define TARGET_HASH_PRIME 257

/* Define the hash table itself.  */
static struct target_info **target_hash_table;

/* For each basic block, we maintain a generation number of its basic
   block info, which is updated each time we move an insn from the
   target of a jump.  This is the generation number indexed by block
   number.  */

static int *bb_ticks;

/* Mapping between INSN_UID's and position in the code since INSN_UID's do
   not always monotonically increase.  */
static int *uid_to_ruid;

/* Highest valid index in `uid_to_ruid'.  */
static int max_uid;

static void mark_referenced_resources PROTO((rtx, struct resources *, int));
static void mark_set_resources	PROTO((rtx, struct resources *, int, int));
static int stop_search_p	PROTO((rtx, int));
static int resource_conflicts_p	PROTO((struct resources *,
				       struct resources *));
static int insn_references_resource_p PROTO((rtx, struct resources *, int));
static int insn_sets_resource_p PROTO((rtx, struct resources *, int));
static rtx find_end_label	PROTO((void));
static rtx emit_delay_sequence	PROTO((rtx, rtx, int, int));
static rtx add_to_delay_list	PROTO((rtx, rtx));
static void delete_from_delay_slot PROTO((rtx));
static void delete_scheduled_jump PROTO((rtx));
static void note_delay_statistics PROTO((int, int));
static rtx optimize_skip	PROTO((rtx));
static int get_jump_flags	PROTO((rtx, rtx));
static int rare_destination	PROTO((rtx));
static int mostly_true_jump	PROTO((rtx, rtx));
static rtx get_branch_condition	PROTO((rtx, rtx));
static int condition_dominates_p PROTO((rtx, rtx));
static rtx steal_delay_list_from_target PROTO((rtx, rtx, rtx, rtx,
					       struct resources *,
					       struct resources *,
					       struct resources *,
					       int, int *, int *, rtx *));
static rtx steal_delay_list_from_fallthrough PROTO((rtx, rtx, rtx, rtx,
						    struct resources *,
						    struct resources *,
						    struct resources *,
						    int, int *, int *));
static void try_merge_delay_insns PROTO((rtx, rtx));
static rtx redundant_insn	PROTO((rtx, rtx, rtx));
static int own_thread_p		PROTO((rtx, rtx, int));
static int find_basic_block	PROTO((rtx));
static void update_block	PROTO((rtx, rtx));
static int reorg_redirect_jump	PROTO((rtx, rtx));
static void update_reg_dead_notes PROTO((rtx, rtx));
static void update_reg_unused_notes PROTO((rtx, rtx));
static void update_live_status	PROTO((rtx, rtx));
static rtx next_insn_no_annul	PROTO((rtx));
static void mark_target_live_regs PROTO((rtx, struct resources *));
static void fill_simple_delay_slots PROTO((rtx, int));
static rtx fill_slots_from_thread PROTO((rtx, rtx, rtx, rtx, int, int,
					 int, int, int, int *));
static void fill_eager_delay_slots PROTO((rtx));
static void relax_delay_slots	PROTO((rtx));
static void make_return_insns	PROTO((rtx));
static int redirect_with_delay_slots_safe_p PROTO ((rtx, rtx, rtx));
static int redirect_with_delay_list_safe_p PROTO ((rtx, rtx, rtx));

/* Given X, some rtl, and RES, a pointer to a `struct resources', mark
   which resources are referenced by the insn.  If INCLUDE_DELAYED_EFFECTS
   is TRUE, resources used by the called routine will be included for
   CALL_INSNs.  */

static void
mark_referenced_resources (x, res, include_delayed_effects)
     register rtx x;
     register struct resources *res;
     register int include_delayed_effects;
{
  register enum rtx_code code = GET_CODE (x);
  register int i, j;
  register char *format_ptr;

  /* Handle leaf items for which we set resource flags.  Also, special-case
     CALL, SET and CLOBBER operators.  */
  switch (code)
    {
    case CONST:
    case CONST_INT:
    case CONST_DOUBLE:
    case PC:
    case SYMBOL_REF:
    case LABEL_REF:
      return;

    case SUBREG:
      if (GET_CODE (SUBREG_REG (x)) != REG)
	mark_referenced_resources (SUBREG_REG (x), res, 0);
      else
	{
	  int regno = REGNO (SUBREG_REG (x)) + SUBREG_WORD (x);
	  int last_regno = regno + HARD_REGNO_NREGS (regno, GET_MODE (x));
	  for (i = regno; i < last_regno; i++)
	    SET_HARD_REG_BIT (res->regs, i);
	}
      return;

    case REG:
      for (i = 0; i < HARD_REGNO_NREGS (REGNO (x), GET_MODE (x)); i++)
	SET_HARD_REG_BIT (res->regs, REGNO (x) + i);
      return;

    case MEM:
      /* If this memory shouldn't change, it really isn't referencing
	 memory.  */
      if (RTX_UNCHANGING_P (x))
	res->unch_memory = 1;
      else
	res->memory = 1;
      res->volatil = MEM_VOLATILE_P (x);

      /* Mark registers used to access memory.  */
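      /* (For example, (mem (plus (reg R) (const_int 4))) marks R as
	 referenced; the recursive call below walks whatever address
	 form is present.)  */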
      mark_referenced_resources (XEXP (x, 0), res, 0);
      return;

    case CC0:
      res->cc = 1;
      return;

    case UNSPEC_VOLATILE:
    case ASM_INPUT:
      /* Traditional asm's are always volatile.  */
      res->volatil = 1;
      return;

    case ASM_OPERANDS:
      res->volatil = MEM_VOLATILE_P (x);

      /* For all ASM_OPERANDS, we must traverse the vector of input operands.
	 We can not just fall through here since then we would be confused
	 by the ASM_INPUT rtx inside ASM_OPERANDS, which do not indicate
	 traditional asms unlike their normal usage.  */

      for (i = 0; i < ASM_OPERANDS_INPUT_LENGTH (x); i++)
	mark_referenced_resources (ASM_OPERANDS_INPUT (x, i), res, 0);
      return;

    case CALL:
      /* The first operand will be a (MEM (xxx)) but doesn't really reference
	 memory.  The second operand may be referenced, though.  */
      mark_referenced_resources (XEXP (XEXP (x, 0), 0), res, 0);
      mark_referenced_resources (XEXP (x, 1), res, 0);
      return;

    case SET:
      /* Usually, the first operand of SET is set, not referenced.  But
	 registers used to access memory are referenced.  SET_DEST is
	 also referenced if it is a ZERO_EXTRACT or SIGN_EXTRACT.  */

      mark_referenced_resources (SET_SRC (x), res, 0);

      x = SET_DEST (x);
      if (GET_CODE (x) == SIGN_EXTRACT || GET_CODE (x) == ZERO_EXTRACT)
	mark_referenced_resources (x, res, 0);
      else if (GET_CODE (x) == SUBREG)
	x = SUBREG_REG (x);
      if (GET_CODE (x) == MEM)
	mark_referenced_resources (XEXP (x, 0), res, 0);
      return;

    case CLOBBER:
      return;

    case CALL_INSN:
      if (include_delayed_effects)
	{
	  /* A CALL references memory, the frame pointer if it exists, the
	     stack pointer, any global registers and any registers given in
	     USE insns immediately in front of the CALL.

	     However, we may have moved some of the parameter loading insns
	     into the delay slot of this CALL.  If so, the USE's for them
	     don't count and should be skipped.  */
	  rtx insn = PREV_INSN (x);
	  rtx sequence = 0;
	  int seq_size = 0;
	  rtx next = NEXT_INSN (x);
	  int i;

	  /* If we are part of a delay slot sequence, point at the
	     SEQUENCE.  */
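	  /* (When X is the first element of a delay-slot SEQUENCE, the
	     chain links bypass X: NEXT_INSN (PREV_INSN (x)) is the
	     SEQUENCE insn itself, which is how the test below detects
	     the case.)  */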
	  if (NEXT_INSN (insn) != x)
	    {
	      next = NEXT_INSN (NEXT_INSN (insn));
	      sequence = PATTERN (NEXT_INSN (insn));
	      seq_size = XVECLEN (sequence, 0);
	      if (GET_CODE (sequence) != SEQUENCE)
		abort ();
	    }

	  res->memory = 1;
	  SET_HARD_REG_BIT (res->regs, STACK_POINTER_REGNUM);
	  if (frame_pointer_needed)
	    {
	      SET_HARD_REG_BIT (res->regs, FRAME_POINTER_REGNUM);
#if FRAME_POINTER_REGNUM != HARD_FRAME_POINTER_REGNUM
	      SET_HARD_REG_BIT (res->regs, HARD_FRAME_POINTER_REGNUM);
#endif
	    }

	  for (i = 0; i < FIRST_PSEUDO_REGISTER; i++)
	    if (global_regs[i])
	      SET_HARD_REG_BIT (res->regs, i);

	  /* Check for a NOTE_INSN_SETJMP.  If it exists, then we must
	     assume that this call can need any register.

	     This is done to be more conservative about how we handle setjmp.
	     We assume that they both use and set all registers.  Using all
	     registers ensures that a register will not be considered dead
	     just because it crosses a setjmp call.  A register should be
	     considered dead only if the setjmp call returns non-zero.  */
	  if (next && GET_CODE (next) == NOTE
	      && NOTE_LINE_NUMBER (next) == NOTE_INSN_SETJMP)
	    SET_HARD_REG_SET (res->regs);

	  {
	    rtx link;

	    for (link = CALL_INSN_FUNCTION_USAGE (x);
		 link;
		 link = XEXP (link, 1))
	      if (GET_CODE (XEXP (link, 0)) == USE)
		{
		  for (i = 1; i < seq_size; i++)
		    {
		      rtx slot_pat = PATTERN (XVECEXP (sequence, 0, i));
		      if (GET_CODE (slot_pat) == SET
			  && rtx_equal_p (SET_DEST (slot_pat),
					  SET_DEST (XEXP (link, 0))))
			break;
		    }
		  if (i >= seq_size)
		    mark_referenced_resources (SET_DEST (XEXP (link, 0)),
					       res, 0);
		}
	  }
	}

      /* ... fall through to other INSN processing ...  */

    case INSN:
    case JUMP_INSN:

#ifdef INSN_REFERENCES_ARE_DELAYED
      if (! include_delayed_effects
	  && INSN_REFERENCES_ARE_DELAYED (x))
	return;
#endif

      /* No special processing, just speed up.  */
      mark_referenced_resources (PATTERN (x), res, include_delayed_effects);
      return;
    }

  /* Process each sub-expression and flag what it needs.  */
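  /* (In the rtx format string, `e' is a subexpression and `E' a vector of
     subexpressions; the other operand kinds cannot mention resources, so
     they are skipped.)  */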
  format_ptr = GET_RTX_FORMAT (code);
  for (i = 0; i < GET_RTX_LENGTH (code); i++)
    switch (*format_ptr++)
      {
      case 'e':
	mark_referenced_resources (XEXP (x, i), res, include_delayed_effects);
	break;

      case 'E':
	for (j = 0; j < XVECLEN (x, i); j++)
	  mark_referenced_resources (XVECEXP (x, i, j), res,
				     include_delayed_effects);
	break;
      }
}

/* Given X, a part of an insn, and a pointer to a `struct resources', RES,
   indicate which resources are modified by the insn.  If
   INCLUDE_DELAYED_EFFECTS is nonzero, also mark resources potentially
   set by the called routine.

   If IN_DEST is nonzero, it means we are inside a SET.  Otherwise,
   objects are being referenced instead of set.

   We never mark the insn as modifying the condition code unless it explicitly
   SETs CC0 even though this is not totally correct.  The reason for this is
   that we require a SET of CC0 to immediately precede the reference to CC0.
   So if some other insn sets CC0 as a side-effect, we know it cannot affect
   our computation and thus may be placed in a delay slot.  */

static void
mark_set_resources (x, res, in_dest, include_delayed_effects)
     register rtx x;
     register struct resources *res;
     int in_dest;
     int include_delayed_effects;
{
  register enum rtx_code code;
  register int i, j;
  register char *format_ptr;

 restart:

  code = GET_CODE (x);

  switch (code)
    {
    case NOTE:
    case BARRIER:
    case CODE_LABEL:
    case USE:
    case CONST_INT:
    case CONST_DOUBLE:
    case LABEL_REF:
    case SYMBOL_REF:
    case CONST:
    case PC:
      /* These don't set any resources.  */
      return;

    case CC0:
      if (in_dest)
	res->cc = 1;
      return;

    case CALL_INSN:
      /* Called routine modifies the condition code, memory, any registers
	 that aren't saved across calls, global registers and anything
	 explicitly CLOBBERed immediately after the CALL_INSN.  */

      if (include_delayed_effects)
	{
	  rtx next = NEXT_INSN (x);
	  rtx prev = PREV_INSN (x);
	  rtx link;

	  res->cc = res->memory = 1;
	  for (i = 0; i < FIRST_PSEUDO_REGISTER; i++)
	    if (call_used_regs[i] || global_regs[i])
	      SET_HARD_REG_BIT (res->regs, i);

	  /* If X is part of a delay slot sequence, then NEXT should be
	     the first insn after the sequence.  */
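	  /* (Same SEQUENCE-detection trick as in mark_referenced_resources:
	     when X heads a delay-slot sequence, NEXT_INSN (PREV_INSN (x))
	     is the SEQUENCE insn rather than X, so we skip past the whole
	     sequence.)  */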
	  if (NEXT_INSN (prev) != x)
	    next = NEXT_INSN (NEXT_INSN (prev));

	  for (link = CALL_INSN_FUNCTION_USAGE (x);
	       link; link = XEXP (link, 1))
	    if (GET_CODE (XEXP (link, 0)) == CLOBBER)
	      mark_set_resources (SET_DEST (XEXP (link, 0)), res, 1, 0);

	  /* Check for a NOTE_INSN_SETJMP.  If it exists, then we must
	     assume that this call can clobber any register.  */
	  if (next && GET_CODE (next) == NOTE
	      && NOTE_LINE_NUMBER (next) == NOTE_INSN_SETJMP)
	    SET_HARD_REG_SET (res->regs);
	}

      /* ... and also what its RTL says it modifies, if anything.  */

    case JUMP_INSN:
    case INSN:

	/* An insn consisting of just a CLOBBER (or USE) is just for flow
	   and doesn't actually do anything, so we ignore it.  */

#ifdef INSN_SETS_ARE_DELAYED
      if (! include_delayed_effects
	  && INSN_SETS_ARE_DELAYED (x))
	return;
#endif

      x = PATTERN (x);
      if (GET_CODE (x) != USE && GET_CODE (x) != CLOBBER)
	goto restart;
      return;

    case SET:
      /* If the source of a SET is a CALL, this is actually done by
	 the called routine.  So only include it if we are to include the
	 effects of the called routine.  */

      mark_set_resources (SET_DEST (x), res,
			  (include_delayed_effects
			   || GET_CODE (SET_SRC (x)) != CALL),
			  0);

      mark_set_resources (SET_SRC (x), res, 0, 0);
      return;

    case CLOBBER:
      mark_set_resources (XEXP (x, 0), res, 1, 0);
      return;

    case SEQUENCE:
      for (i = 0; i < XVECLEN (x, 0); i++)
	if (! (INSN_ANNULLED_BRANCH_P (XVECEXP (x, 0, 0))
	       && INSN_FROM_TARGET_P (XVECEXP (x, 0, i))))
	  mark_set_resources (XVECEXP (x, 0, i), res, 0,
			      include_delayed_effects);
      return;

    case POST_INC:
    case PRE_INC:
    case POST_DEC:
    case PRE_DEC:
      mark_set_resources (XEXP (x, 0), res, 1, 0);
      return;

    case ZERO_EXTRACT:
      mark_set_resources (XEXP (x, 0), res, in_dest, 0);
      mark_set_resources (XEXP (x, 1), res, 0, 0);
      mark_set_resources (XEXP (x, 2), res, 0, 0);
      return;

    case MEM:
      if (in_dest)
	{
	  res->memory = 1;
	  res->unch_memory = RTX_UNCHANGING_P (x);
	  res->volatil = MEM_VOLATILE_P (x);
	}

      mark_set_resources (XEXP (x, 0), res, 0, 0);
      return;

    case SUBREG:
      if (in_dest)
	{
	  if (GET_CODE (SUBREG_REG (x)) != REG)
	    mark_set_resources (SUBREG_REG (x), res,
				in_dest, include_delayed_effects);
	  else
	    {
	      int regno = REGNO (SUBREG_REG (x)) + SUBREG_WORD (x);
	      int last_regno = regno + HARD_REGNO_NREGS (regno, GET_MODE (x));
	      for (i = regno; i < last_regno; i++)
		SET_HARD_REG_BIT (res->regs, i);
	    }
	}
      return;

    case REG:
      if (in_dest)
	for (i = 0; i < HARD_REGNO_NREGS (REGNO (x), GET_MODE (x)); i++)
	  SET_HARD_REG_BIT (res->regs, REGNO (x) + i);
      return;
    }

  /* Process each sub-expression and flag what it needs.  */
  format_ptr = GET_RTX_FORMAT (code);
  for (i = 0; i < GET_RTX_LENGTH (code); i++)
    switch (*format_ptr++)
      {
      case 'e':
	mark_set_resources (XEXP (x, i), res, in_dest,
			    include_delayed_effects);
	break;

      case 'E':
	for (j = 0; j < XVECLEN (x, i); j++)
	  mark_set_resources (XVECEXP (x, i, j), res, in_dest,
			      include_delayed_effects);
	break;
      }
}

/* Return TRUE if this insn should stop the search for insns to fill delay
   slots.  LABELS_P indicates that labels should terminate the search.
   In all cases, jumps terminate the search.  */

static int
stop_search_p (insn, labels_p)
     rtx insn;
     int labels_p;
{
  if (insn == 0)
    return 1;

  switch (GET_CODE (insn))
    {
    case NOTE:
    case CALL_INSN:
      return 0;

    case CODE_LABEL:
      return labels_p;

    case JUMP_INSN:
    case BARRIER:
      return 1;

    case INSN:
      /* OK unless it contains a delay slot or is an `asm' insn of some type.
	 We don't know anything about these.  */
      return (GET_CODE (PATTERN (insn)) == SEQUENCE
	      || GET_CODE (PATTERN (insn)) == ASM_INPUT
	      || asm_noperands (PATTERN (insn)) >= 0);

    default:
      abort ();
    }
}

/* Return TRUE if any resources are marked in both RES1 and RES2 or if either
   resource set contains a volatile memory reference.  Otherwise, return
   FALSE.  */

static int
resource_conflicts_p (res1, res2)
     struct resources *res1, *res2;
{
  if ((res1->cc && res2->cc) || (res1->memory && res2->memory)
      || (res1->unch_memory && res2->unch_memory)
      || res1->volatil || res2->volatil)
    return 1;

#ifdef HARD_REG_SET
  return (res1->regs & res2->regs) != HARD_CONST (0);
#else
  {
    int i;

    for (i = 0; i < HARD_REG_SET_LONGS; i++)
      if ((res1->regs[i] & res2->regs[i]) != 0)
	return 1;
    return 0;
  }
#endif
}

/* Return TRUE if any resource marked in RES, a `struct resources', is
   referenced by INSN.  If INCLUDE_DELAYED_EFFECTS is set, return if the
   called routine is using those resources.

   We compute this by computing all the resources referenced by INSN and
   seeing if this conflicts with RES.  It might be faster to directly check
   ourselves, and this is the way it used to work, but it means duplicating
   a large block of complex code.  */

static int
insn_references_resource_p (insn, res, include_delayed_effects)
     register rtx insn;
     register struct resources *res;
     int include_delayed_effects;
{
  struct resources insn_res;

  CLEAR_RESOURCE (&insn_res);
  mark_referenced_resources (insn, &insn_res, include_delayed_effects);
  return resource_conflicts_p (&insn_res, res);
}

/* Return TRUE if INSN modifies resources that are marked in RES.
   INCLUDE_DELAYED_EFFECTS is set if the actions of the called routine
   should be included.  CC0 is only modified if it is explicitly set; see
   comments in front of mark_set_resources for details.  */

static int
insn_sets_resource_p (insn, res, include_delayed_effects)
     register rtx insn;
     register struct resources *res;
     int include_delayed_effects;
{
  struct resources insn_sets;

  CLEAR_RESOURCE (&insn_sets);
  mark_set_resources (insn, &insn_sets, 0, include_delayed_effects);
  return resource_conflicts_p (&insn_sets, res);
}

/* Find a label at the end of the function or before a RETURN.  If there is
   none, make one.  */

static rtx
find_end_label ()
{
  rtx insn;

  /* If we found one previously, return it.  */
  if (end_of_function_label)
    return end_of_function_label;

  /* Otherwise, see if there is a label at the end of the function.  If there
     is, it must be that RETURN insns aren't needed, so that is our return
     label and we don't have to do anything else.  */

  insn = get_last_insn ();
  while (GET_CODE (insn) == NOTE
	 || (GET_CODE (insn) == INSN
	     && (GET_CODE (PATTERN (insn)) == USE
		 || GET_CODE (PATTERN (insn)) == CLOBBER)))
    insn = PREV_INSN (insn);

  /* When a target threads its epilogue we might already have a
     suitable return insn.  If so put a label before it for the
     end_of_function_label.  */
  if (GET_CODE (insn) == BARRIER
      && GET_CODE (PREV_INSN (insn)) == JUMP_INSN
      && GET_CODE (PATTERN (PREV_INSN (insn))) == RETURN)
    {
      rtx temp = PREV_INSN (PREV_INSN (insn));
      end_of_function_label = gen_label_rtx ();
      LABEL_NUSES (end_of_function_label) = 0;

      /* Put the label before any USE insns that may precede the RETURN
	 insn.  */
      while (GET_CODE (temp) == USE)
	temp = PREV_INSN (temp);

      emit_label_after (end_of_function_label, temp);
    }

  else if (GET_CODE (insn) == CODE_LABEL)
    end_of_function_label = insn;
  else
    {
      /* Otherwise, make a new label and emit a RETURN and BARRIER,
	 if needed.  */
      end_of_function_label = gen_label_rtx ();
      LABEL_NUSES (end_of_function_label) = 0;
      emit_label (end_of_function_label);
#ifdef HAVE_return
      if (HAVE_return)
	{
	  /* The return we make may have delay slots too.  */
	  rtx insn = gen_return ();
	  insn = emit_jump_insn (insn);
	  emit_barrier ();
	  if (num_delay_slots (insn) > 0)
	    obstack_ptr_grow (&unfilled_slots_obstack, insn);
	}
#endif
    }

  /* Show one additional use for this label so it won't go away until
     we are done.  */
  ++LABEL_NUSES (end_of_function_label);

  return end_of_function_label;
}

/* Put INSN and LIST together in a SEQUENCE rtx of LENGTH, and replace
   the pattern of INSN with the SEQUENCE.

   Chain the insns so that NEXT_INSN of each insn in the sequence points to
   the next and NEXT_INSN of the last insn in the sequence points to
   the first insn after the sequence.  Similarly for PREV_INSN.  This makes
   it easier to scan all insns.

   Returns the SEQUENCE that replaces INSN.  */
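
/* Schematically (an illustrative sketch, not exact RTL), a branch with a
   single filled delay slot becomes one insn whose pattern is a SEQUENCE:

	(insn (sequence [(jump_insn ...)	;; the branch itself
			 (insn ...)]))		;; the delay-slot insn

   with the PREV_INSN/NEXT_INSN links of the inner insns threaded around
   the SEQUENCE insn as described above.  */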

static rtx
emit_delay_sequence (insn, list, length, avail)
     rtx insn;
     rtx list;
     int length;
     int avail;
{
  register int i = 1;
  register rtx li;
  int had_barrier = 0;

  /* Allocate the rtvec to hold the insns and the SEQUENCE.  */
  rtvec seqv = rtvec_alloc (length + 1);
  rtx seq = gen_rtx (SEQUENCE, VOIDmode, seqv);
  rtx seq_insn = make_insn_raw (seq);
  rtx first = get_insns ();
  rtx last = get_last_insn ();

  /* Make a copy of the insn having delay slots.  */
  rtx delay_insn = copy_rtx (insn);

  /* If INSN is followed by a BARRIER, delete the BARRIER since it will only
     confuse further processing.  Update LAST in case it was the last insn.
     We will put the BARRIER back in later.  */
  if (NEXT_INSN (insn) && GET_CODE (NEXT_INSN (insn)) == BARRIER)
    {
      delete_insn (NEXT_INSN (insn));
      last = get_last_insn ();
      had_barrier = 1;
    }

  /* Splice our SEQUENCE into the insn stream where INSN used to be.  */
  NEXT_INSN (seq_insn) = NEXT_INSN (insn);
  PREV_INSN (seq_insn) = PREV_INSN (insn);

  if (insn == last)
    set_new_first_and_last_insn (first, seq_insn);
  else
    PREV_INSN (NEXT_INSN (seq_insn)) = seq_insn;

  if (insn == first)
    set_new_first_and_last_insn (seq_insn, last);
  else
    NEXT_INSN (PREV_INSN (seq_insn)) = seq_insn;

  /* Build our SEQUENCE and rebuild the insn chain.  */
  XVECEXP (seq, 0, 0) = delay_insn;
  INSN_DELETED_P (delay_insn) = 0;
  PREV_INSN (delay_insn) = PREV_INSN (seq_insn);

  for (li = list; li; li = XEXP (li, 1), i++)
    {
      rtx tem = XEXP (li, 0);
      rtx note;

      /* Show that this copy of the insn isn't deleted.  */
      INSN_DELETED_P (tem) = 0;

      XVECEXP (seq, 0, i) = tem;
      PREV_INSN (tem) = XVECEXP (seq, 0, i - 1);
      NEXT_INSN (XVECEXP (seq, 0, i - 1)) = tem;

      /* Remove any REG_DEAD notes because we can't rely on them now
	 that the insn has been moved.  */
      for (note = REG_NOTES (tem); note; note = XEXP (note, 1))
	if (REG_NOTE_KIND (note) == REG_DEAD)
	  XEXP (note, 0) = const0_rtx;
    }

  NEXT_INSN (XVECEXP (seq, 0, length)) = NEXT_INSN (seq_insn);

  /* If the previous insn is a SEQUENCE, update the NEXT_INSN pointer on the
     last insn in that SEQUENCE to point to us.  Similarly, if the following
     insn is a SEQUENCE, update the PREV_INSN pointer on its first insn.  */

  if (PREV_INSN (seq_insn) && GET_CODE (PREV_INSN (seq_insn)) == INSN
      && GET_CODE (PATTERN (PREV_INSN (seq_insn))) == SEQUENCE)
    NEXT_INSN (XVECEXP (PATTERN (PREV_INSN (seq_insn)), 0,
			XVECLEN (PATTERN (PREV_INSN (seq_insn)), 0) - 1))
      = seq_insn;

  if (NEXT_INSN (seq_insn) && GET_CODE (NEXT_INSN (seq_insn)) == INSN
      && GET_CODE (PATTERN (NEXT_INSN (seq_insn))) == SEQUENCE)
    PREV_INSN (XVECEXP (PATTERN (NEXT_INSN (seq_insn)), 0, 0)) = seq_insn;

  /* If there used to be a BARRIER, put it back.  */
  if (had_barrier)
    emit_barrier_after (seq_insn);

  if (i != length + 1)
    abort ();

  return seq_insn;
}

/* Add INSN to DELAY_LIST and return the head of the new list.  The list must
   be in the order in which the insns are to be executed.  */

static rtx
add_to_delay_list (insn, delay_list)
     rtx insn;
     rtx delay_list;
{
  /* If we have an empty list, just make a new list element.  If
     INSN has its block number recorded, clear it since we may
     be moving the insn to a new block.  */

  if (delay_list == 0)
    {
      struct target_info *tinfo;

      for (tinfo = target_hash_table[INSN_UID (insn) % TARGET_HASH_PRIME];
	   tinfo; tinfo = tinfo->next)
	if (tinfo->uid == INSN_UID (insn))
	  break;

      if (tinfo)
	tinfo->block = -1;

      return gen_rtx (INSN_LIST, VOIDmode, insn, NULL_RTX);
    }

  /* Otherwise this must be an INSN_LIST.  Add INSN to the end of the
     list.  */
  XEXP (delay_list, 1) = add_to_delay_list (insn, XEXP (delay_list, 1));

  return delay_list;
}

/* Delete INSN from the delay slot of the insn that it is in.  This may
   produce an insn without anything in its delay slots.  */

static void
delete_from_delay_slot (insn)
     rtx insn;
{
  rtx trial, seq_insn, seq, prev;
  rtx delay_list = 0;
  int i;

  /* We first must find the insn containing the SEQUENCE with INSN in its
     delay slot.  Do this by finding an insn, TRIAL, where
     PREV_INSN (NEXT_INSN (TRIAL)) != TRIAL.  */

  for (trial = insn;
       PREV_INSN (NEXT_INSN (trial)) == trial;
       trial = NEXT_INSN (trial))
    ;

  seq_insn = PREV_INSN (NEXT_INSN (trial));
  seq = PATTERN (seq_insn);

  /* Create a delay list consisting of all the insns other than the one
     we are deleting (unless we were the only one).  */
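  /* (Element 0 of the SEQUENCE is the insn needing the delay slots, so a
     vector length of 2 means INSN was the only delay-slot insn and the
     list stays empty.)  */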
  if (XVECLEN (seq, 0) > 2)
    for (i = 1; i < XVECLEN (seq, 0); i++)
      if (XVECEXP (seq, 0, i) != insn)
	delay_list = add_to_delay_list (XVECEXP (seq, 0, i), delay_list);

  /* Delete the old SEQUENCE, re-emit the insn that used to have the delay
     list, and rebuild the delay list if non-empty.  */
  prev = PREV_INSN (seq_insn);
  trial = XVECEXP (seq, 0, 0);
  delete_insn (seq_insn);
  add_insn_after (trial, prev);

  if (GET_CODE (trial) == JUMP_INSN
      && (simplejump_p (trial) || GET_CODE (PATTERN (trial)) == RETURN))
    emit_barrier_after (trial);

  /* If there are any delay insns, re-emit them.  Otherwise clear the
     annul flag.  */
  if (delay_list)
    trial = emit_delay_sequence (trial, delay_list, XVECLEN (seq, 0) - 2, 0);
  else
    INSN_ANNULLED_BRANCH_P (trial) = 0;

  INSN_FROM_TARGET_P (insn) = 0;

  /* Show we need to fill this insn again.  */
  obstack_ptr_grow (&unfilled_slots_obstack, trial);
}

/* Delete INSN, a JUMP_INSN.  If it is a conditional jump, we must track down
   the insn that sets CC0 for it and delete it too.  */

static void
delete_scheduled_jump (insn)
     rtx insn;
{
  /* Delete the insn that sets cc0 for us.  On machines without cc0, we could
     delete the insn that sets the condition code, but it is hard to find it.
     Since this case is rare anyway, don't bother trying; there would likely
     be other insns that became dead anyway, which we wouldn't know to
     delete.  */

#ifdef HAVE_cc0
  if (reg_mentioned_p (cc0_rtx, insn))
    {
      rtx note = find_reg_note (insn, REG_CC_SETTER, NULL_RTX);

      /* If a reg-note was found, it points to the insn that sets CC0.  This
	 insn is in the delay list of some other insn.  So delete it from
	 the delay list it was in.  */
      if (note)
	{
	  if (! FIND_REG_INC_NOTE (XEXP (note, 0), NULL_RTX)
	      && sets_cc0_p (PATTERN (XEXP (note, 0))) == 1)
	    delete_from_delay_slot (XEXP (note, 0));
	}
      else
	{
	  /* The insn setting CC0 is our previous insn, but it may be in
	     a delay slot.  It will be the last insn in the delay slot, if
	     it is.  */
	  rtx trial = previous_insn (insn);
	  if (GET_CODE (trial) == NOTE)
	    trial = prev_nonnote_insn (trial);
	  if (sets_cc0_p (PATTERN (trial)) != 1
	      || FIND_REG_INC_NOTE (trial, 0))
	    return;
	  if (PREV_INSN (NEXT_INSN (trial)) == trial)
	    delete_insn (trial);
	  else
	    delete_from_delay_slot (trial);
	}
    }
#endif

  delete_insn (insn);
}

/* Counters for delay-slot filling.  */
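/* (num_insns_needing_delays is indexed by filling function and reorg pass;
   num_filled_delays is additionally indexed by the number of slots filled,
   capped at MAX_DELAY_HISTOGRAM.)  */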

#define NUM_REORG_FUNCTIONS 2
#define MAX_DELAY_HISTOGRAM 3
#define MAX_REORG_PASSES 2

static int num_insns_needing_delays[NUM_REORG_FUNCTIONS][MAX_REORG_PASSES];

static int num_filled_delays[NUM_REORG_FUNCTIONS][MAX_DELAY_HISTOGRAM+1][MAX_REORG_PASSES];

static int reorg_pass_number;

static void
note_delay_statistics (slots_filled, index)
     int slots_filled, index;
{
  num_insns_needing_delays[index][reorg_pass_number]++;
  if (slots_filled > MAX_DELAY_HISTOGRAM)
    slots_filled = MAX_DELAY_HISTOGRAM;
  num_filled_delays[index][slots_filled][reorg_pass_number]++;
}

#if defined(ANNUL_IFFALSE_SLOTS) || defined(ANNUL_IFTRUE_SLOTS)

/* Optimize the following cases:

   1.  When a conditional branch skips over only one instruction,
       use an annulling branch and put that insn in the delay slot.
       Use either a branch that annuls when the condition is true or
       invert the test with a branch that annuls when the condition is
       false.  This saves insns, since otherwise we must copy an insn
       from the L1 target.

	(orig)		(skip)		(otherwise)
	Bcc.n L1	Bcc',a L1	Bcc,a L1'
	insn		insn		insn2
      L1:	      L1:	      L1:
	insn2		insn2		insn2
	insn3		insn3	      L1':
					insn3

   2.  When a conditional branch skips over only one instruction,
       and after that, it unconditionally branches somewhere else,
       perform the similar optimization.  This saves executing the
       second branch in the case where the inverted condition is true.

	Bcc.n L1	Bcc',a L2
	insn		insn
      L1:	      L1:
	Bra L2		Bra L2

   INSN is a JUMP_INSN.

   This should be expanded to skip over N insns, where N is the number
   of delay slots required.  */

static rtx
optimize_skip (insn)
     register rtx insn;
{
  register rtx trial = next_nonnote_insn (insn);
  rtx next_trial = next_active_insn (trial);
  rtx delay_list = 0;
  rtx target_label;
  int flags;

  flags = get_jump_flags (insn, JUMP_LABEL (insn));

  if (trial == 0
      || GET_CODE (trial) != INSN
      || GET_CODE (PATTERN (trial)) == SEQUENCE
      || recog_memoized (trial) < 0
      || (! eligible_for_annul_false (insn, 0, trial, flags)
	  && ! eligible_for_annul_true (insn, 0, trial, flags)))
    return 0;

  /* There are two cases where we are just executing one insn (we assume
     here that a branch requires only one insn; this should be generalized
     at some point): Where the branch goes around a single insn or where
     we have one insn followed by a branch to the same label we branch to.
     In both of these cases, inverting the jump and annulling the delay
     slot give the same effect in fewer insns.  */
  if ((next_trial == next_active_insn (JUMP_LABEL (insn)))
      || (next_trial != 0
	  && GET_CODE (next_trial) == JUMP_INSN
	  && JUMP_LABEL (insn) == JUMP_LABEL (next_trial)
	  && (simplejump_p (next_trial)
	      || GET_CODE (PATTERN (next_trial)) == RETURN)))
    {
      if (eligible_for_annul_false (insn, 0, trial, flags))
	{
	  if (invert_jump (insn, JUMP_LABEL (insn)))
	    INSN_FROM_TARGET_P (trial) = 1;
	  else if (! eligible_for_annul_true (insn, 0, trial, flags))
	    return 0;
	}

      delay_list = add_to_delay_list (trial, NULL_RTX);
      next_trial = next_active_insn (trial);
      update_block (trial, trial);
      delete_insn (trial);

      /* Also, if we are targeting an unconditional
	 branch, thread our jump to the target of that branch.  Don't
	 change this into a RETURN here, because it may not accept what
	 we have in the delay slot.  We'll fix this up later.  */
      if (next_trial && GET_CODE (next_trial) == JUMP_INSN
	  && (simplejump_p (next_trial)
	      || GET_CODE (PATTERN (next_trial)) == RETURN))
	{
	  target_label = JUMP_LABEL (next_trial);
	  if (target_label == 0)
	    target_label = find_end_label ();

	  /* Recompute the flags based on TARGET_LABEL since threading
	     the jump to TARGET_LABEL may change the direction of the
	     jump (which may change the circumstances in which the
	     delay slot is nullified).  */
	  flags = get_jump_flags (insn, target_label);
	  if (eligible_for_annul_true (insn, 0, trial, flags))
	    reorg_redirect_jump (insn, target_label);
	}

      INSN_ANNULLED_BRANCH_P (insn) = 1;
    }

  return delay_list;
}
#endif


/* Encode and return branch direction and prediction information for
   INSN assuming it will jump to LABEL.

   Unconditional branches return no direction information and
   are predicted as very likely taken.  */

static int
get_jump_flags (insn, label)
     rtx insn, label;
{
  int flags;

  /* get_jump_flags can be passed any insn with delay slots, these may
     be INSNs, CALL_INSNs, or JUMP_INSNs.  Only JUMP_INSNs have branch
     direction information, and only if they are conditional jumps.

     If LABEL is zero, then there is no way to determine the branch
     direction.  */
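  /* (Direction comes from position in the insn stream: a label whose ruid
     is greater than the jump's means a forward branch; a smaller ruid means
     a backward branch, e.g. the bottom of a loop.)  */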
  if (GET_CODE (insn) == JUMP_INSN
      && (condjump_p (insn) || condjump_in_parallel_p (insn))
      && INSN_UID (insn) <= max_uid
      && label != 0
      && INSN_UID (label) <= max_uid)
    flags
      = (uid_to_ruid[INSN_UID (label)] > uid_to_ruid[INSN_UID (insn)])
	? ATTR_FLAG_forward : ATTR_FLAG_backward;
  /* No valid direction information.  */
  else
    flags = 0;

  /* If INSN is a conditional branch, call mostly_true_jump to determine
     the branch prediction.

     Unconditional branches are predicted as very likely taken.  */
  if (GET_CODE (insn) == JUMP_INSN
      && (condjump_p (insn) || condjump_in_parallel_p (insn)))
    {
      int prediction;

      prediction = mostly_true_jump (insn, get_branch_condition (insn, label));
      switch (prediction)
	{
	case 2:
	  flags |= (ATTR_FLAG_very_likely | ATTR_FLAG_likely);
	  break;
	case 1:
	  flags |= ATTR_FLAG_likely;
	  break;
	case 0:
	  flags |= ATTR_FLAG_unlikely;
	  break;
	case -1:
	  flags |= (ATTR_FLAG_very_unlikely | ATTR_FLAG_unlikely);
	  break;

	default:
	  abort ();
	}
    }
  else
    flags |= (ATTR_FLAG_very_likely | ATTR_FLAG_likely);

  return flags;
}

/* Return 1 if INSN is a destination that will be branched to rarely (the
   return point of a function); return 2 if INSN will be branched to very
   rarely (a call to a function that doesn't return).  Otherwise,
   return 0.  */

static int
rare_destination (insn)
     rtx insn;
{
  int jump_count = 0;
  rtx next;

  for (; insn; insn = next)
    {
      if (GET_CODE (insn) == INSN && GET_CODE (PATTERN (insn)) == SEQUENCE)
	insn = XVECEXP (PATTERN (insn), 0, 0);

      next = NEXT_INSN (insn);

      switch (GET_CODE (insn))
	{
	case CODE_LABEL:
	  return 0;
	case BARRIER:
	  /* A BARRIER can either be after a JUMP_INSN or a CALL_INSN.  We
	     don't scan past JUMP_INSNs, so any barrier we find here must
	     have been after a CALL_INSN and hence mean the call doesn't
	     return.  */
	  return 2;
	case JUMP_INSN:
	  if (GET_CODE (PATTERN (insn)) == RETURN)
	    return 1;
	  else if (simplejump_p (insn)
		   && jump_count++ < 10)
	    next = JUMP_LABEL (insn);
	  else
	    return 0;
	}
    }

  /* If we got here it means we hit the end of the function.  So this
     is an unlikely destination.  */
  return 1;
}

/* Return the truth value of the statement that this branch is mostly
   taken.  If we think that the branch is extremely likely to be taken,
   we return 2.  If the branch is slightly more likely to be taken,
   return 1.  If the branch is slightly less likely to be taken, return 0
   and if the branch is highly unlikely to be taken, return -1.

   CONDITION, if non-zero, is the condition that JUMP_INSN is testing.  */

static int
mostly_true_jump (jump_insn, condition)
     rtx jump_insn, condition;
{
  rtx target_label = JUMP_LABEL (jump_insn);
  rtx insn;
  int rare_dest = rare_destination (target_label);
  int rare_fallthrough = rare_destination (NEXT_INSN (jump_insn));

  /* If this is a branch outside a loop, it is highly unlikely.  */
  if (GET_CODE (PATTERN (jump_insn)) == SET
      && GET_CODE (SET_SRC (PATTERN (jump_insn))) == IF_THEN_ELSE
      && ((GET_CODE (XEXP (SET_SRC (PATTERN (jump_insn)), 1)) == LABEL_REF
           && LABEL_OUTSIDE_LOOP_P (XEXP (SET_SRC (PATTERN (jump_insn)), 1)))
          || (GET_CODE (XEXP (SET_SRC (PATTERN (jump_insn)), 2)) == LABEL_REF
              && LABEL_OUTSIDE_LOOP_P (XEXP (SET_SRC (PATTERN (jump_insn)),
                                             2)))))
    return -1;

  if (target_label)
    {
      /* If this is the test of a loop, it is very likely true.  We scan
         backwards from the target label.  If we find a NOTE_INSN_LOOP_BEG
         before the next real insn, we assume the branch is to the top of
         the loop.  */
      for (insn = PREV_INSN (target_label);
           insn && GET_CODE (insn) == NOTE;
           insn = PREV_INSN (insn))
        if (NOTE_LINE_NUMBER (insn) == NOTE_INSN_LOOP_BEG)
          return 2;

      /* If this is a jump to the test of a loop, it is likely true.  We scan
         forwards from the target label.  If we find a NOTE_INSN_LOOP_VTOP
         before the next real insn, we assume the branch is to the loop branch
         test.  */
      for (insn = NEXT_INSN (target_label);
           insn && GET_CODE (insn) == NOTE;
           insn = NEXT_INSN (insn))
        if (NOTE_LINE_NUMBER (insn) == NOTE_INSN_LOOP_VTOP)
          return 1;
    }

  /* Look at the relative rarities of the fallthrough and destination.  If
     they differ, we can predict the branch that way.  */

  switch (rare_fallthrough - rare_dest)
    {
    case -2:
      return -1;
    case -1:
      return 0;
    case 0:
      break;
    case 1:
      return 1;
    case 2:
      return 2;
    }

  /* If we couldn't figure out what this jump was, assume it won't be
     taken.  This should be rare.  */
  if (condition == 0)
    return 0;
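  /* Worked example for the rarity test above (an illustrative addendum, not
     in the original): if the fall-through path leads to a call that never
     returns (rare_fallthrough == 2) while the branch target is ordinary
     code (rare_dest == 0), the difference is 2 and the branch is predicted
     very likely taken; the mirror-image case gives -2, very unlikely.  */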
  /* EQ tests are usually false and NE tests are usually true.  Also,
     most quantities are positive, so we can make the appropriate guesses
     about signed comparisons against zero.  */
  switch (GET_CODE (condition))
    {
    case CONST_INT:
      /* Unconditional branch.  */
      return 1;
    case EQ:
      return 0;
    case NE:
      return 1;
    case LE:
    case LT:
      if (XEXP (condition, 1) == const0_rtx)
        return 0;
      break;
    case GE:
    case GT:
      if (XEXP (condition, 1) == const0_rtx)
        return 1;
      break;
    }

  /* Predict backward branches usually taken, forward branches usually not.
     If we don't know whether this is forward or backward, assume the branch
     will be taken, since most are.  */
  return (target_label == 0 || INSN_UID (jump_insn) > max_uid
          || INSN_UID (target_label) > max_uid
          || (uid_to_ruid[INSN_UID (jump_insn)]
              > uid_to_ruid[INSN_UID (target_label)]));
}

/* Return the condition under which INSN will branch to TARGET.  If TARGET
   is zero, return the condition under which INSN will return.  If INSN is
   an unconditional branch, return const_true_rtx.  If INSN isn't a simple
   type of jump, or it doesn't go to TARGET, return 0.  */

static rtx
get_branch_condition (insn, target)
     rtx insn;
     rtx target;
{
  rtx pat = PATTERN (insn);
  rtx src;

  if (condjump_in_parallel_p (insn))
    pat = XVECEXP (pat, 0, 0);

  if (GET_CODE (pat) == RETURN)
    return target == 0 ? const_true_rtx : 0;

  else if (GET_CODE (pat) != SET || SET_DEST (pat) != pc_rtx)
    return 0;

  src = SET_SRC (pat);
  if (GET_CODE (src) == LABEL_REF && XEXP (src, 0) == target)
    return const_true_rtx;

  else if (GET_CODE (src) == IF_THEN_ELSE
           && ((target == 0 && GET_CODE (XEXP (src, 1)) == RETURN)
               || (GET_CODE (XEXP (src, 1)) == LABEL_REF
                   && XEXP (XEXP (src, 1), 0) == target))
           && XEXP (src, 2) == pc_rtx)
    return XEXP (src, 0);

  else if (GET_CODE (src) == IF_THEN_ELSE
           && ((target == 0 && GET_CODE (XEXP (src, 2)) == RETURN)
               || (GET_CODE (XEXP (src, 2)) == LABEL_REF
                   && XEXP (XEXP (src, 2), 0) == target))
           && XEXP (src, 1) == pc_rtx)
    return gen_rtx (reverse_condition (GET_CODE (XEXP (src, 0))),
                    GET_MODE (XEXP (src, 0)),
                    XEXP (XEXP (src, 0), 0), XEXP (XEXP (src, 0), 1));

  return 0;
}

/* Return non-zero if CONDITION is more strict than the condition of
   INSN, i.e., if INSN will always branch if CONDITION is true.  */
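/* For instance (an illustrative addendum, not in the original): if
   CONDITION is (gt (reg) (const_int 0)) and INSN branches when
   (ge (reg) (const_int 0)), then GT dominates GE -- whenever the stricter
   test holds, the branch is certainly taken -- and comparison_dominates_p
   makes us return non-zero.  */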
static int
condition_dominates_p (condition, insn)
     rtx condition;
     rtx insn;
{
  rtx other_condition = get_branch_condition (insn, JUMP_LABEL (insn));
  enum rtx_code code = GET_CODE (condition);
  enum rtx_code other_code;

  if (rtx_equal_p (condition, other_condition)
      || other_condition == const_true_rtx)
    return 1;

  else if (condition == const_true_rtx || other_condition == 0)
    return 0;

  other_code = GET_CODE (other_condition);
  if (GET_RTX_LENGTH (code) != 2 || GET_RTX_LENGTH (other_code) != 2
      || ! rtx_equal_p (XEXP (condition, 0), XEXP (other_condition, 0))
      || ! rtx_equal_p (XEXP (condition, 1), XEXP (other_condition, 1)))
    return 0;

  return comparison_dominates_p (code, other_code);
}

/* Return non-zero if redirecting JUMP to NEWLABEL does not invalidate
   any insns already in the delay slot of JUMP.  */

static int
redirect_with_delay_slots_safe_p (jump, newlabel, seq)
     rtx jump, newlabel, seq;
{
  int flags, i;
  rtx pat = PATTERN (seq);

  /* Make sure all the delay slots of this jump would still
     be valid after threading the jump.  If they are still
     valid, then return non-zero.  */

  flags = get_jump_flags (jump, newlabel);
  for (i = 1; i < XVECLEN (pat, 0); i++)
    if (! (
#ifdef ANNUL_IFFALSE_SLOTS
           (INSN_ANNULLED_BRANCH_P (jump)
            && INSN_FROM_TARGET_P (XVECEXP (pat, 0, i)))
           ? eligible_for_annul_false (jump, i - 1,
                                       XVECEXP (pat, 0, i), flags) :
#endif
#ifdef ANNUL_IFTRUE_SLOTS
           (INSN_ANNULLED_BRANCH_P (jump)
            && ! INSN_FROM_TARGET_P (XVECEXP (pat, 0, i)))
           ? eligible_for_annul_true (jump, i - 1,
                                      XVECEXP (pat, 0, i), flags) :
#endif
           eligible_for_delay (jump, i - 1, XVECEXP (pat, 0, i), flags)))
      break;

  return (i == XVECLEN (pat, 0));
}

/* Return non-zero if redirecting JUMP to NEWLABEL does not invalidate
   any insns we wish to place in the delay slot of JUMP.  */

static int
redirect_with_delay_list_safe_p (jump, newlabel, delay_list)
     rtx jump, newlabel, delay_list;
{
  int flags, i;
  rtx li;

  /* Make sure all the insns in DELAY_LIST would still be
     valid after threading the jump.  If they are still
     valid, then return non-zero.  */

  flags = get_jump_flags (jump, newlabel);
  for (li = delay_list, i = 0; li; li = XEXP (li, 1), i++)
    if (! (
#ifdef ANNUL_IFFALSE_SLOTS
           (INSN_ANNULLED_BRANCH_P (jump)
            && INSN_FROM_TARGET_P (XEXP (li, 0)))
           ? eligible_for_annul_false (jump, i, XEXP (li, 0), flags) :
#endif
#ifdef ANNUL_IFTRUE_SLOTS
           (INSN_ANNULLED_BRANCH_P (jump)
            && ! INSN_FROM_TARGET_P (XEXP (li, 0)))
           ? eligible_for_annul_true (jump, i, XEXP (li, 0), flags) :
#endif
           eligible_for_delay (jump, i, XEXP (li, 0), flags)))
      break;

  return (li == NULL);
}

/* INSN branches to an insn whose pattern SEQ is a SEQUENCE.  Given that
   the condition tested by INSN is CONDITION and the resources shown in
   OTHER_NEEDED are needed after INSN, see whether INSN can take all the insns
   from SEQ's delay list, in addition to whatever insns it may execute
   (in DELAY_LIST).  SETS and NEEDED denote resources already set and
   needed while searching for delay slot insns.  Return the concatenated
   delay list if possible, otherwise, return 0.

   SLOTS_TO_FILL is the total number of slots required by INSN, and
   PSLOTS_FILLED points to the number filled so far (also the number of
   insns in DELAY_LIST).  It is updated with the number that have been
   filled from the SEQUENCE, if any.

   PANNUL_P points to a non-zero value if we already know that we need
   to annul INSN.  If this routine determines that annulling is needed,
   it may set that value non-zero.

   PNEW_THREAD points to a location that is to receive the place at which
   execution should continue.  */

static rtx
steal_delay_list_from_target (insn, condition, seq, delay_list,
                              sets, needed, other_needed,
                              slots_to_fill, pslots_filled, pannul_p,
                              pnew_thread)
     rtx insn, condition;
     rtx seq;
     rtx delay_list;
     struct resources *sets, *needed, *other_needed;
     int slots_to_fill;
     int *pslots_filled;
     int *pannul_p;
     rtx *pnew_thread;
{
  rtx temp;
  int slots_remaining = slots_to_fill - *pslots_filled;
  int total_slots_filled = *pslots_filled;
  rtx new_delay_list = 0;
  int must_annul = *pannul_p;
  int i;

  /* We can't do anything if there are more delay slots in SEQ than we
     can handle, or if we don't know that it will be a taken branch.
     We know that it will be a taken branch if it is either an unconditional
     branch or a conditional branch with a stricter branch condition.

     Also, exit if the branch has more than one set, since then it is
     computing other results that can't be ignored, e.g. the HPPA mov&branch
     instruction.
     ??? It may be possible to move other sets into INSN in addition to
     moving the instructions in the delay slots.  */
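  /* Illustrative note, not part of the original code: the pattern of a
     filled insn is a SEQUENCE whose element 0 is the insn itself and whose
     elements 1..n are the insns in its delay slots, e.g.

         (sequence [(jump_insn ...)    ; element 0: the branch
                    (insn ...)])       ; element 1: its delay slot

     so XVECLEN (seq, 0) - 1 below is the number of filled slots in SEQ.  */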
  if (XVECLEN (seq, 0) - 1 > slots_remaining
      || ! condition_dominates_p (condition, XVECEXP (seq, 0, 0))
      || ! single_set (XVECEXP (seq, 0, 0)))
    return delay_list;

  for (i = 1; i < XVECLEN (seq, 0); i++)
    {
      rtx trial = XVECEXP (seq, 0, i);
      int flags;

      if (insn_references_resource_p (trial, sets, 0)
          || insn_sets_resource_p (trial, needed, 0)
          || insn_sets_resource_p (trial, sets, 0)
#ifdef HAVE_cc0
          /* If TRIAL sets CC0, we can't copy it, so we can't steal this
             delay list.  */
          || find_reg_note (trial, REG_CC_USER, NULL_RTX)
#endif
          /* If TRIAL is from the fallthrough code of an annulled branch insn
             in SEQ, we cannot use it.  */
          || (INSN_ANNULLED_BRANCH_P (XVECEXP (seq, 0, 0))
              && ! INSN_FROM_TARGET_P (trial)))
        return delay_list;

      /* If this insn was already done (usually in a previous delay slot),
         pretend we put it in our delay slot.  */
      if (redundant_insn (trial, insn, new_delay_list))
        continue;

      /* We will end up re-vectoring this branch, so compute flags
         based on jumping to the new label.  */
      flags = get_jump_flags (insn, JUMP_LABEL (XVECEXP (seq, 0, 0)));

      if (! must_annul
          && ((condition == const_true_rtx
               || (! insn_sets_resource_p (trial, other_needed, 0)
                   && ! may_trap_p (PATTERN (trial)))))
          ? eligible_for_delay (insn, total_slots_filled, trial, flags)
          : (must_annul = 1,
             eligible_for_annul_false (insn, total_slots_filled, trial, flags)))
        {
          temp = copy_rtx (trial);
          INSN_FROM_TARGET_P (temp) = 1;
          new_delay_list = add_to_delay_list (temp, new_delay_list);
          total_slots_filled++;

          if (--slots_remaining == 0)
            break;
        }
      else
        return delay_list;
    }

  /* Show the place to which we will be branching.  */
  *pnew_thread = next_active_insn (JUMP_LABEL (XVECEXP (seq, 0, 0)));

  /* Add any new insns to the delay list and update the count of the
     number of slots filled.  */
  *pslots_filled = total_slots_filled;
  *pannul_p = must_annul;

  if (delay_list == 0)
    return new_delay_list;

  for (temp = new_delay_list; temp; temp = XEXP (temp, 1))
    delay_list = add_to_delay_list (XEXP (temp, 0), delay_list);

  return delay_list;
}

/* Similar to steal_delay_list_from_target except that SEQ is on the
   fallthrough path of INSN.  Here we only do something if the delay insn
   of SEQ is an unconditional branch.  In that case we steal its delay slot
   for INSN since unconditional branches are much easier to fill.  */
static rtx
steal_delay_list_from_fallthrough (insn, condition, seq,
                                   delay_list, sets, needed, other_needed,
                                   slots_to_fill, pslots_filled, pannul_p)
     rtx insn, condition;
     rtx seq;
     rtx delay_list;
     struct resources *sets, *needed, *other_needed;
     int slots_to_fill;
     int *pslots_filled;
     int *pannul_p;
{
  int i;
  int flags;

  flags = get_jump_flags (insn, JUMP_LABEL (insn));

  /* We can't do anything if SEQ's delay insn isn't an
     unconditional branch.  */

  if (! simplejump_p (XVECEXP (seq, 0, 0))
      && GET_CODE (PATTERN (XVECEXP (seq, 0, 0))) != RETURN)
    return delay_list;

  for (i = 1; i < XVECLEN (seq, 0); i++)
    {
      rtx trial = XVECEXP (seq, 0, i);

      /* If TRIAL sets CC0, stealing it will move it too far from the use
         of CC0.  */
      if (insn_references_resource_p (trial, sets, 0)
          || insn_sets_resource_p (trial, needed, 0)
          || insn_sets_resource_p (trial, sets, 0)
#ifdef HAVE_cc0
          || sets_cc0_p (PATTERN (trial))
#endif
          )
        break;

      /* If this insn was already done, we don't need it.  */
      if (redundant_insn (trial, insn, delay_list))
        {
          delete_from_delay_slot (trial);
          continue;
        }

      if (! *pannul_p
          && ((condition == const_true_rtx
               || (! insn_sets_resource_p (trial, other_needed, 0)
                   && ! may_trap_p (PATTERN (trial)))))
          ? eligible_for_delay (insn, *pslots_filled, trial, flags)
          : (*pannul_p = 1,
             eligible_for_annul_true (insn, *pslots_filled, trial, flags)))
        {
          delete_from_delay_slot (trial);
          delay_list = add_to_delay_list (trial, delay_list);

          if (++(*pslots_filled) == slots_to_fill)
            break;
        }
      else
        break;
    }

  return delay_list;
}

/* Try merging insns starting at THREAD which match exactly the insns in
   INSN's delay list.

   If all insns were matched and the insn was previously annulling, the
   annul bit will be cleared.

   For each insn that is merged, if the branch is or will be non-annulling,
   we delete the merged insn.  */
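/* An illustrative scenario, not part of the original code: if INSN is a
   filled branch whose delay slot holds an increment of r2 and THREAD begins
   with an identical increment of r2, the copy at THREAD is redundant; we
   delete it (or, for an annulling branch, record it in MERGED_INSNS until
   every slot has been matched).  */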
static void
try_merge_delay_insns (insn, thread)
     rtx insn, thread;
{
  rtx trial, next_trial;
  rtx delay_insn = XVECEXP (PATTERN (insn), 0, 0);
  int annul_p = INSN_ANNULLED_BRANCH_P (delay_insn);
  int slot_number = 1;
  int num_slots = XVECLEN (PATTERN (insn), 0);
  rtx next_to_match = XVECEXP (PATTERN (insn), 0, slot_number);
  struct resources set, needed;
  rtx merged_insns = 0;
  int i;
  int flags;

  flags = get_jump_flags (delay_insn, JUMP_LABEL (delay_insn));

  CLEAR_RESOURCE (&needed);
  CLEAR_RESOURCE (&set);

  /* If this is not an annulling branch, take into account anything needed in
     NEXT_TO_MATCH.  This prevents two increments from being incorrectly
     folded into one.  If we are annulling, this would be the correct
     thing to do.  (The alternative, looking at things set in NEXT_TO_MATCH,
     would essentially disable this optimization.  This method is somewhat of
     a kludge, but I don't see a better way.)  */
  if (! annul_p)
    mark_referenced_resources (next_to_match, &needed, 1);

  for (trial = thread; !stop_search_p (trial, 1); trial = next_trial)
    {
      rtx pat = PATTERN (trial);
      rtx oldtrial = trial;

      next_trial = next_nonnote_insn (trial);

      /* TRIAL must be a CALL_INSN or INSN.  Skip USE and CLOBBER.  */
      if (GET_CODE (trial) == INSN
          && (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER))
        continue;

      if (GET_CODE (next_to_match) == GET_CODE (trial)
#ifdef HAVE_cc0
          /* We can't share an insn that sets cc0.  */
          && ! sets_cc0_p (pat)
#endif
          && ! insn_references_resource_p (trial, &set, 1)
          && ! insn_sets_resource_p (trial, &set, 1)
          && ! insn_sets_resource_p (trial, &needed, 1)
          && (trial = try_split (pat, trial, 0)) != 0
          /* Update next_trial, in case try_split succeeded.  */
          && (next_trial = next_nonnote_insn (trial))
          /* Likewise THREAD.  */
          && (thread = oldtrial == thread ? trial : thread)
          && rtx_equal_p (PATTERN (next_to_match), PATTERN (trial))
          /* Have to test this condition if annul condition is different
             from (and less restrictive than) non-annulling one.  */
          && eligible_for_delay (delay_insn, slot_number - 1, trial, flags))
        {
          if (! annul_p)
            {
              update_block (trial, thread);
              if (trial == thread)
                thread = next_active_insn (thread);

              delete_insn (trial);
              INSN_FROM_TARGET_P (next_to_match) = 0;
            }
          else
            merged_insns = gen_rtx (INSN_LIST, VOIDmode, trial, merged_insns);

          if (++slot_number == num_slots)
            break;

          next_to_match = XVECEXP (PATTERN (insn), 0, slot_number);
          if (! annul_p)
            mark_referenced_resources (next_to_match, &needed, 1);
        }

      mark_set_resources (trial, &set, 0, 1);
      mark_referenced_resources (trial, &needed, 1);
    }

  /* See if we stopped on a filled insn.  If we did, try to see if its
     delay slots match.  */
  if (slot_number != num_slots
      && trial && GET_CODE (trial) == INSN
      && GET_CODE (PATTERN (trial)) == SEQUENCE
      && ! INSN_ANNULLED_BRANCH_P (XVECEXP (PATTERN (trial), 0, 0)))
    {
      rtx pat = PATTERN (trial);
      rtx filled_insn = XVECEXP (pat, 0, 0);

      /* Account for resources set/needed by the filled insn.  */
      mark_set_resources (filled_insn, &set, 0, 1);
      mark_referenced_resources (filled_insn, &needed, 1);

      for (i = 1; i < XVECLEN (pat, 0); i++)
        {
          rtx dtrial = XVECEXP (pat, 0, i);

          if (! insn_references_resource_p (dtrial, &set, 1)
              && ! insn_sets_resource_p (dtrial, &set, 1)
              && ! insn_sets_resource_p (dtrial, &needed, 1)
#ifdef HAVE_cc0
              && ! sets_cc0_p (PATTERN (dtrial))
#endif
              && rtx_equal_p (PATTERN (next_to_match), PATTERN (dtrial))
              && eligible_for_delay (delay_insn, slot_number - 1, dtrial,
                                     flags))
            {
              if (! annul_p)
                {
                  update_block (dtrial, thread);
                  delete_from_delay_slot (dtrial);
                  INSN_FROM_TARGET_P (next_to_match) = 0;
                }
              else
                merged_insns = gen_rtx (INSN_LIST, SImode, dtrial,
                                        merged_insns);

              if (++slot_number == num_slots)
                break;

              next_to_match = XVECEXP (PATTERN (insn), 0, slot_number);
            }
        }
    }

  /* If all insns in the delay slot have been matched and we were previously
     annulling the branch, we need not do so any more.  In that case delete
     all the merged insns.  Also clear the INSN_FROM_TARGET_P bit of each
     insn in the delay list so that we know that it isn't only being used
     at the target.  */
  if (slot_number == num_slots && annul_p)
    {
      for (; merged_insns; merged_insns = XEXP (merged_insns, 1))
        {
          if (GET_MODE (merged_insns) == SImode)
            {
              update_block (XEXP (merged_insns, 0), thread);
              delete_from_delay_slot (XEXP (merged_insns, 0));
            }
          else
            {
              update_block (XEXP (merged_insns, 0), thread);
              delete_insn (XEXP (merged_insns, 0));
            }
        }

      INSN_ANNULLED_BRANCH_P (delay_insn) = 0;

      for (i = 0; i < XVECLEN (PATTERN (insn), 0); i++)
        INSN_FROM_TARGET_P (XVECEXP (PATTERN (insn), 0, i)) = 0;
    }
}

/* See if INSN is redundant with an insn in front of TARGET.  Often this
   is called when INSN is a candidate for a delay slot of TARGET.
   DELAY_LIST is a list of insns that will be placed in delay slots of TARGET
   in front of INSN.  Often INSN will be redundant with an insn in a delay
   slot of some previous insn.  This happens when we have a series of branches
   to the same label; in that case the first insn at the target might want to
   go into each of the delay slots.

   If we are not careful, this routine can take up a significant fraction
   of the total compilation time (4%), but only wins rarely.  Hence we
   speed this routine up by making two passes.  The first pass goes back
   until it hits a label and sees if it finds an insn with an identical
   pattern.  Only in this (relatively rare) event does it check for
   data conflicts.

   We do not split insns we encounter.  This could cause us not to find a
   redundant insn, but the cost of splitting seems greater than the possible
   gain in rare cases.  */

static rtx
redundant_insn (insn, target, delay_list)
     rtx insn;
     rtx target;
     rtx delay_list;
{
  rtx target_main = target;
  rtx ipat = PATTERN (insn);
  rtx trial, pat;
  struct resources needed, set;
  int i;

  /* Scan backwards looking for a match.  */
  for (trial = PREV_INSN (target); trial; trial = PREV_INSN (trial))
    {
      if (GET_CODE (trial) == CODE_LABEL)
        return 0;

      if (GET_RTX_CLASS (GET_CODE (trial)) != 'i')
        continue;

      pat = PATTERN (trial);
      if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER)
        continue;

      if (GET_CODE (pat) == SEQUENCE)
        {
          /* Stop for a CALL and its delay slots because it is difficult to
             track its resource needs correctly.  */
          if (GET_CODE (XVECEXP (pat, 0, 0)) == CALL_INSN)
            return 0;

          /* Stop for an INSN or JUMP_INSN with delayed effects and its delay
             slots because it is difficult to track its resource needs
             correctly.  */
#ifdef INSN_SETS_ARE_DELAYED
          if (INSN_SETS_ARE_DELAYED (XVECEXP (pat, 0, 0)))
            return 0;
#endif

#ifdef INSN_REFERENCES_ARE_DELAYED
          if (INSN_REFERENCES_ARE_DELAYED (XVECEXP (pat, 0, 0)))
            return 0;
#endif

          /* See if any of the insns in the delay slot match, updating
             resource requirements as we go.  */
          for (i = XVECLEN (pat, 0) - 1; i > 0; i--)
            if (GET_CODE (XVECEXP (pat, 0, i)) == GET_CODE (insn)
                && rtx_equal_p (PATTERN (XVECEXP (pat, 0, i)), ipat))
              break;

          /* If we found a match, exit this loop early.  */
          if (i > 0)
            break;
        }

      else if (GET_CODE (trial) == GET_CODE (insn) && rtx_equal_p (pat, ipat))
        break;
    }

  /* If we didn't find an insn that matches, return 0.  */
  if (trial == 0)
    return 0;

  /* See what resources this insn sets and needs.  If they overlap, or
     if this insn references CC0, it can't be redundant.  */

  CLEAR_RESOURCE (&needed);
  CLEAR_RESOURCE (&set);
  mark_set_resources (insn, &set, 0, 1);
  mark_referenced_resources (insn, &needed, 1);

  /* If TARGET is a SEQUENCE, get the main insn.  */
  if (GET_CODE (target) == INSN && GET_CODE (PATTERN (target)) == SEQUENCE)
    target_main = XVECEXP (PATTERN (target), 0, 0);

  if (resource_conflicts_p (&needed, &set)
#ifdef HAVE_cc0
      || reg_mentioned_p (cc0_rtx, ipat)
#endif
      /* The insn requiring the delay may not set anything needed or set by
         INSN.  */
      || insn_sets_resource_p (target_main, &needed, 1)
      || insn_sets_resource_p (target_main, &set, 1))
    return 0;

  /* Insns we pass may not set either NEEDED or SET, so merge them for
     simpler tests.  */
  needed.memory |= set.memory;
  needed.unch_memory |= set.unch_memory;
  IOR_HARD_REG_SET (needed.regs, set.regs);

  /* This insn isn't redundant if it conflicts with an insn that either is
     or will be in a delay slot of TARGET.  */

  while (delay_list)
    {
      if (insn_sets_resource_p (XEXP (delay_list, 0), &needed, 1))
        return 0;
      delay_list = XEXP (delay_list, 1);
    }

  if (GET_CODE (target) == INSN && GET_CODE (PATTERN (target)) == SEQUENCE)
    for (i = 1; i < XVECLEN (PATTERN (target), 0); i++)
      if (insn_sets_resource_p (XVECEXP (PATTERN (target), 0, i), &needed, 1))
        return 0;

  /* Scan backwards until we reach a label or an insn that uses something
     INSN sets or sets something INSN uses or sets.  */
  for (trial = PREV_INSN (target);
       trial && GET_CODE (trial) != CODE_LABEL;
       trial = PREV_INSN (trial))
    {
      if (GET_CODE (trial) != INSN && GET_CODE (trial) != CALL_INSN
          && GET_CODE (trial) != JUMP_INSN)
        continue;

      pat = PATTERN (trial);
      if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER)
        continue;

      if (GET_CODE (pat) == SEQUENCE)
        {
          /* If this is a CALL_INSN and its delay slots, it is hard to track
             the resource needs properly, so give up.  */
          if (GET_CODE (XVECEXP (pat, 0, 0)) == CALL_INSN)
            return 0;

          /* If this is an INSN or JUMP_INSN with delayed effects, it
             is hard to track the resource needs properly, so give up.  */

#ifdef INSN_SETS_ARE_DELAYED
          if (INSN_SETS_ARE_DELAYED (XVECEXP (pat, 0, 0)))
            return 0;
#endif

#ifdef INSN_REFERENCES_ARE_DELAYED
          if (INSN_REFERENCES_ARE_DELAYED (XVECEXP (pat, 0, 0)))
            return 0;
#endif

          /* See if any of the insns in the delay slot match, updating
             resource requirements as we go.  */
          for (i = XVECLEN (pat, 0) - 1; i > 0; i--)
            {
              rtx candidate = XVECEXP (pat, 0, i);

              /* If an insn will be annulled if the branch is false, it isn't
                 considered as a possible duplicate insn.  */
              if (rtx_equal_p (PATTERN (candidate), ipat)
                  && ! (INSN_ANNULLED_BRANCH_P (XVECEXP (pat, 0, 0))
                        && INSN_FROM_TARGET_P (candidate)))
                {
                  /* Show that this insn will be used in the sequel.  */
                  INSN_FROM_TARGET_P (candidate) = 0;
                  return candidate;
                }

              /* Unless this is an annulled insn from the target of a branch,
                 we must stop if it sets anything needed or set by INSN.  */
              if ((! INSN_ANNULLED_BRANCH_P (XVECEXP (pat, 0, 0))
                   || ! INSN_FROM_TARGET_P (candidate))
                  && insn_sets_resource_p (candidate, &needed, 1))
                return 0;
            }

          /* If the insn requiring the delay slot conflicts with INSN, we
             must stop.  */
          if (insn_sets_resource_p (XVECEXP (pat, 0, 0), &needed, 1))
            return 0;
        }
      else
        {
          /* See if TRIAL is the same as INSN.  */
          pat = PATTERN (trial);
          if (rtx_equal_p (pat, ipat))
            return trial;

          /* Can't go any further if TRIAL conflicts with INSN.  */
          if (insn_sets_resource_p (trial, &needed, 1))
            return 0;
        }
    }

  return 0;
}
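/* An illustrative sketch of the case redundant_insn handles, not part of
   the original code: with several branches filled from the same target,

        b1: branch L  [delay slot: r3 = r4]
        b2: branch L  [delay slot: r3 = r4]
        L:  r3 = r4

   a candidate `r3 = r4' for yet another branch to L is redundant with the
   copies already placed in the earlier delay slots, and need not be added
   again.  */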
/* Return 1 if THREAD can only be executed in one way.  If LABEL is non-zero,
   it is the target of the branch insn being scanned.  If ALLOW_FALLTHROUGH
   is non-zero, we are allowed to fall into this thread; otherwise, we are
   not.

   If LABEL is used more than once or we pass a label other than LABEL before
   finding an active insn, we do not own this thread.  */

static int
own_thread_p (thread, label, allow_fallthrough)
     rtx thread;
     rtx label;
     int allow_fallthrough;
{
  rtx active_insn;
  rtx insn;

  /* We don't own the function end.  */
  if (thread == 0)
    return 0;

  /* Get the first active insn, or THREAD, if it is an active insn.  */
  active_insn = next_active_insn (PREV_INSN (thread));

  for (insn = thread; insn != active_insn; insn = NEXT_INSN (insn))
    if (GET_CODE (insn) == CODE_LABEL
        && (insn != label || LABEL_NUSES (insn) != 1))
      return 0;

  if (allow_fallthrough)
    return 1;

  /* Ensure that we reach a BARRIER before any insn or label.  */
  for (insn = prev_nonnote_insn (thread);
       insn == 0 || GET_CODE (insn) != BARRIER;
       insn = prev_nonnote_insn (insn))
    if (insn == 0
        || GET_CODE (insn) == CODE_LABEL
        || (GET_CODE (insn) == INSN
            && GET_CODE (PATTERN (insn)) != USE
            && GET_CODE (PATTERN (insn)) != CLOBBER))
      return 0;

  return 1;
}

/* Find the number of the basic block that starts closest to INSN.  Return -1
   if we couldn't find such a basic block.  */

static int
find_basic_block (insn)
     rtx insn;
{
  int i;

  /* Scan backwards to the previous BARRIER.  Then see if we can find a
     label that starts a basic block.  Return the basic block number.  */

  for (insn = prev_nonnote_insn (insn);
       insn && GET_CODE (insn) != BARRIER;
       insn = prev_nonnote_insn (insn))
    ;

  /* The start of the function is basic block zero.  */
  if (insn == 0)
    return 0;

  /* See if any of the upcoming CODE_LABELs start a basic block.  If we reach
     anything other than a CODE_LABEL or note, we can't find this code.  */
  for (insn = next_nonnote_insn (insn);
       insn && GET_CODE (insn) == CODE_LABEL;
       insn = next_nonnote_insn (insn))
    {
      for (i = 0; i < n_basic_blocks; i++)
        if (insn == basic_block_head[i])
          return i;
    }

  return -1;
}

/* Called when INSN is being moved from a location near the target of a jump.
   We leave a marker of the form (use (INSN)) immediately in front
   of WHERE for mark_target_live_regs.  These markers will be deleted when
   reorg finishes.
   We used to try to update the live status of registers if WHERE is at
   the start of a basic block, but that can't work since we may remove a
   BARRIER in relax_delay_slots.  */

static void
update_block (insn, where)
     rtx insn;
     rtx where;
{
  int b;

  /* Ignore if this was in a delay slot and it came from the target of
     a branch.  */
  if (INSN_FROM_TARGET_P (insn))
    return;

  emit_insn_before (gen_rtx (USE, VOIDmode, insn), where);

  /* INSN might be making a value live in a block where it didn't use to
     be.  So recompute liveness information for this block.  */

  b = find_basic_block (insn);
  if (b != -1)
    bb_ticks[b]++;
}

/* Similar to REDIRECT_JUMP except that we update the BB_TICKS entry for
   the basic block containing the jump.  */

static int
reorg_redirect_jump (jump, nlabel)
     rtx jump;
     rtx nlabel;
{
  int b = find_basic_block (jump);

  if (b != -1)
    bb_ticks[b]++;

  return redirect_jump (jump, nlabel);
}

/* Called when INSN is being moved forward into a delay slot of DELAYED_INSN.
   We check every instruction between INSN and DELAYED_INSN for REG_DEAD notes
   that reference values used in INSN.  If we find one, then we move the
   REG_DEAD note to INSN.

   This is needed to handle the case where a later insn (after INSN) has a
   REG_DEAD note for a register used by INSN, and this later insn subsequently
   gets moved before a CODE_LABEL because it is a redundant insn.  In this
   case, mark_target_live_regs may be confused into thinking the register
   is dead because it sees a REG_DEAD note immediately before a CODE_LABEL.  */

static void
update_reg_dead_notes (insn, delayed_insn)
     rtx insn, delayed_insn;
{
  rtx p, link, next;

  for (p = next_nonnote_insn (insn); p != delayed_insn;
       p = next_nonnote_insn (p))
    for (link = REG_NOTES (p); link; link = next)
      {
        next = XEXP (link, 1);

        if (REG_NOTE_KIND (link) != REG_DEAD
            || GET_CODE (XEXP (link, 0)) != REG)
          continue;

        if (reg_referenced_p (XEXP (link, 0), PATTERN (insn)))
          {
            /* Move the REG_DEAD note from P to INSN.  */
            remove_note (p, link);
            XEXP (link, 1) = REG_NOTES (insn);
            REG_NOTES (insn) = link;
          }
      }
}
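/* An example of the note movement performed above, not part of the
   original code:

        I1: r5 = r6 + 1                 ; INSN, moving into a delay slot
        I2: ...  (REG_DEAD r6)          ; between I1 and DELAYED_INSN

   When I1 is moved forward past I2, the REG_DEAD note for r6 is
   transferred from I2 to I1 so that later liveness computations still
   see the death of r6 in the right place.  */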
/* Delete any REG_UNUSED notes that exist on INSN but not on REDUNDANT_INSN.

   This handles the case of udivmodXi4 instructions which optimize their
   output depending on whether any REG_UNUSED notes are present.  We must
   make sure that INSN calculates as many results as REDUNDANT_INSN
   does.  */

static void
update_reg_unused_notes (insn, redundant_insn)
     rtx insn, redundant_insn;
{
  rtx link, next;

  for (link = REG_NOTES (insn); link; link = next)
    {
      next = XEXP (link, 1);

      if (REG_NOTE_KIND (link) != REG_UNUSED
          || GET_CODE (XEXP (link, 0)) != REG)
        continue;

      if (! find_regno_note (redundant_insn, REG_UNUSED,
                             REGNO (XEXP (link, 0))))
        remove_note (insn, link);
    }
}

/* Marks registers possibly live at the current place being scanned by
   mark_target_live_regs.  Used only by the next two functions.  */

static HARD_REG_SET current_live_regs;

/* Marks registers for which we have seen a REG_DEAD note but no assignment.
   Also only used by the next two functions.  */

static HARD_REG_SET pending_dead_regs;

/* Utility function called from mark_target_live_regs via note_stores.
   It deadens any CLOBBERed registers and livens any SET registers.  */

static void
update_live_status (dest, x)
     rtx dest;
     rtx x;
{
  int first_regno, last_regno;
  int i;

  if (GET_CODE (dest) != REG
      && (GET_CODE (dest) != SUBREG || GET_CODE (SUBREG_REG (dest)) != REG))
    return;

  if (GET_CODE (dest) == SUBREG)
    first_regno = REGNO (SUBREG_REG (dest)) + SUBREG_WORD (dest);
  else
    first_regno = REGNO (dest);

  last_regno = first_regno + HARD_REGNO_NREGS (first_regno, GET_MODE (dest));

  if (GET_CODE (x) == CLOBBER)
    for (i = first_regno; i < last_regno; i++)
      CLEAR_HARD_REG_BIT (current_live_regs, i);
  else
    for (i = first_regno; i < last_regno; i++)
      {
        SET_HARD_REG_BIT (current_live_regs, i);
        CLEAR_HARD_REG_BIT (pending_dead_regs, i);
      }
}

/* Similar to next_insn, but ignores insns in the delay slots of
   an annulled branch.  */

static rtx
next_insn_no_annul (insn)
     rtx insn;
{
  if (insn)
    {
      /* If INSN is an annulled branch, skip any insns from the target
         of the branch.  */
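      /* Illustrative note, not part of the original code: in the SEQUENCE
         of an annulled branch, slots filled from the branch target have
         INSN_FROM_TARGET_P set and execute only when the branch is taken,
         which is why a fall-through walk must step over them here.  */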
      if (INSN_ANNULLED_BRANCH_P (insn)
          && NEXT_INSN (PREV_INSN (insn)) != insn)
        while (INSN_FROM_TARGET_P (NEXT_INSN (insn)))
          insn = NEXT_INSN (insn);

      insn = NEXT_INSN (insn);
      if (insn && GET_CODE (insn) == INSN
          && GET_CODE (PATTERN (insn)) == SEQUENCE)
        insn = XVECEXP (PATTERN (insn), 0, 0);
    }

  return insn;
}

/* Set the resources that are live at TARGET.

   If TARGET is zero, we refer to the end of the current function and can
   return our precomputed value.

   Otherwise, we try to find out what is live by consulting the basic block
   information.  This is tricky, because we must consider the actions of
   reload and jump optimization, which occur after the basic block information
   has been computed.

   Accordingly, we proceed as follows:

   We find the previous BARRIER and look at all immediately following labels
   (with no intervening active insns) to see if any of them start a basic
   block.  If we hit the start of the function first, we use block 0.

   Once we have found a basic block and a corresponding first insn, we can
   accurately compute the live status from basic_block_live_regs and
   reg_renumber.  (By starting at a label following a BARRIER, we are immune
   to actions taken by reload and jump.)  Then we scan all insns between
   that point and our target.  For each CLOBBER (or for call-clobbered regs
   when we pass a CALL_INSN), mark the appropriate registers as dead.  For
   a SET, mark them as live.

   We have to be careful when using REG_DEAD notes because they are not
   updated by such things as find_equiv_reg.  So keep track of registers
   marked as dead that haven't been assigned to, and mark them dead at the
   next CODE_LABEL since reload and jump won't propagate values across labels.

   If we cannot find the start of a basic block (should be a very rare
   case, if it can happen at all), mark everything as potentially live.

   Next, scan forward from TARGET looking for things set or clobbered
   before they are used.  These are not live.

   Because we can be called many times on the same target, save our results
   in a hash table indexed by INSN_UID.  */

static void
mark_target_live_regs (target, res)
     rtx target;
     struct resources *res;
{
  int b = -1;
  int i;
  struct target_info *tinfo;
  rtx insn, next;
  rtx jump_insn = 0;
  rtx jump_target;
  HARD_REG_SET scratch;
  struct resources set, needed;
  int jump_count = 0;

  /* Handle end of function.  */
  if (target == 0)
    {
      *res = end_of_function_needs;
      return;
    }

  /* We have to assume memory is needed, but the CC isn't.  */
  res->memory = 1;
  res->volatil = res->unch_memory = 0;
  res->cc = 0;

  /* See if we have computed this value already.  */
  for (tinfo = target_hash_table[INSN_UID (target) % TARGET_HASH_PRIME];
       tinfo; tinfo = tinfo->next)
    if (tinfo->uid == INSN_UID (target))
      break;

  /* Start by getting the basic block number.  If we have saved information,
     we can get it from there unless the insn at the start of the basic block
     has been deleted.  */
  if (tinfo && tinfo->block != -1
      && ! INSN_DELETED_P (basic_block_head[tinfo->block]))
    b = tinfo->block;

  if (b == -1)
    b = find_basic_block (target);

  if (tinfo)
    {
      /* If the information is up-to-date, use it.  Otherwise, we will
         update it below.  */
      if (b == tinfo->block && b != -1 && tinfo->bb_tick == bb_ticks[b])
        {
          COPY_HARD_REG_SET (res->regs, tinfo->live_regs);
          return;
        }
    }
  else
    {
      /* Allocate a place to put our results and chain it into the
         hash table.  */
      tinfo = (struct target_info *) oballoc (sizeof (struct target_info));
      tinfo->uid = INSN_UID (target);
      tinfo->block = b;
      tinfo->next = target_hash_table[INSN_UID (target) % TARGET_HASH_PRIME];
      target_hash_table[INSN_UID (target) % TARGET_HASH_PRIME] = tinfo;
    }

  CLEAR_HARD_REG_SET (pending_dead_regs);

  /* If we found a basic block, get the live registers from it and update
     them with anything set or killed between its start and the insn before
     TARGET.  Otherwise, we must assume everything is live.  */
  if (b != -1)
    {
      regset regs_live = basic_block_live_at_start[b];
      int offset, j;
      REGSET_ELT_TYPE bit;
      int regno;
      rtx start_insn, stop_insn;

      /* Compute hard regs live at the start of the block -- this is the
         real hard regs marked live, plus live pseudo regs that have been
         renumbered to hard regs.  */
#ifdef HARD_REG_SET
      current_live_regs = *regs_live;
#else
      COPY_HARD_REG_SET (current_live_regs, regs_live);
#endif

      for (offset = 0, i = 0; offset < regset_size; offset++)
        {
          if (regs_live[offset] == 0)
            i += REGSET_ELT_BITS;
          else
            for (bit = 1; bit && i < max_regno; bit <<= 1, i++)
              if ((regs_live[offset] & bit)
                  && (regno = reg_renumber[i]) >= 0)
                for (j = regno;
                     j < regno + HARD_REGNO_NREGS (regno,
                                                   PSEUDO_REGNO_MODE (i));
                     j++)
                  SET_HARD_REG_BIT (current_live_regs, j);
        }

      /* Get starting and ending insn, handling the case where each might
         be a SEQUENCE.  */
      start_insn = (b == 0 ? get_insns () : basic_block_head[b]);
      stop_insn = target;

      if (GET_CODE (start_insn) == INSN
          && GET_CODE (PATTERN (start_insn)) == SEQUENCE)
        start_insn = XVECEXP (PATTERN (start_insn), 0, 0);

      if (GET_CODE (stop_insn) == INSN
          && GET_CODE (PATTERN (stop_insn)) == SEQUENCE)
        stop_insn = next_insn (PREV_INSN (stop_insn));

      for (insn = start_insn; insn != stop_insn;
           insn = next_insn_no_annul (insn))
        {
          rtx link;
          rtx real_insn = insn;

          /* If this insn is from the target of a branch, it isn't going to
             be used in the sequel.  If it is used in both cases, this
             test will not be true.  */
          if (INSN_FROM_TARGET_P (insn))
            continue;

          /* If this insn is a USE made by update_block, we care about the
             underlying insn.  */
          if (GET_CODE (insn) == INSN && GET_CODE (PATTERN (insn)) == USE
              && GET_RTX_CLASS (GET_CODE (XEXP (PATTERN (insn), 0))) == 'i')
            real_insn = XEXP (PATTERN (insn), 0);

          if (GET_CODE (real_insn) == CALL_INSN)
            {
              /* CALL clobbers all call-used regs that aren't fixed except
                 sp, ap, and fp.  Do this before setting the result of the
                 call live.  */
              for (i = 0; i < FIRST_PSEUDO_REGISTER; i++)
                if (call_used_regs[i]
                    && i != STACK_POINTER_REGNUM && i != FRAME_POINTER_REGNUM
                    && i != ARG_POINTER_REGNUM
#if HARD_FRAME_POINTER_REGNUM != FRAME_POINTER_REGNUM
                    && i != HARD_FRAME_POINTER_REGNUM
#endif
#if ARG_POINTER_REGNUM != FRAME_POINTER_REGNUM
                    && ! (i == ARG_POINTER_REGNUM && fixed_regs[i])
#endif
#ifdef PIC_OFFSET_TABLE_REGNUM
                    && ! (i == PIC_OFFSET_TABLE_REGNUM && flag_pic)
#endif
                    )
                  CLEAR_HARD_REG_BIT (current_live_regs, i);

              /* A CALL_INSN sets any global register live, since it may
                 have been modified by the call.  */
              for (i = 0; i < FIRST_PSEUDO_REGISTER; i++)
                if (global_regs[i])
                  SET_HARD_REG_BIT (current_live_regs, i);
            }

          /* Mark anything killed in an insn to be deadened at the next
             label.  Ignore USE insns; the only REG_DEAD notes will be for
             parameters.  But they might be early.  A CALL_INSN will usually
             clobber registers used for parameters.  It isn't worth bothering
             with the unlikely case when it won't.  */
          if ((GET_CODE (real_insn) == INSN
               && GET_CODE (PATTERN (real_insn)) != USE
               && GET_CODE (PATTERN (real_insn)) != CLOBBER)
              || GET_CODE (real_insn) == JUMP_INSN
              || GET_CODE (real_insn) == CALL_INSN)
            {
              for (link = REG_NOTES (real_insn); link; link = XEXP (link, 1))
                if (REG_NOTE_KIND (link) == REG_DEAD
                    && GET_CODE (XEXP (link, 0)) == REG
                    && REGNO (XEXP (link, 0)) < FIRST_PSEUDO_REGISTER)
                  {
                    int first_regno = REGNO (XEXP (link, 0));
                    int last_regno
                      = (first_regno
                         + HARD_REGNO_NREGS (first_regno,
                                             GET_MODE (XEXP (link, 0))));

                    for (i = first_regno; i < last_regno; i++)
                      SET_HARD_REG_BIT (pending_dead_regs, i);
                  }

              note_stores (PATTERN (real_insn), update_live_status);

              /* If any registers were unused after this insn, kill them.
                 These notes will always be accurate.  */
              for (link = REG_NOTES (real_insn); link; link = XEXP (link, 1))
                if (REG_NOTE_KIND (link) == REG_UNUSED
                    && GET_CODE (XEXP (link, 0)) == REG
                    && REGNO (XEXP (link, 0)) < FIRST_PSEUDO_REGISTER)
                  {
                    int first_regno = REGNO (XEXP (link, 0));
                    int last_regno
                      = (first_regno
                         + HARD_REGNO_NREGS (first_regno,
                                             GET_MODE (XEXP (link, 0))));

                    for (i = first_regno; i < last_regno; i++)
                      CLEAR_HARD_REG_BIT (current_live_regs, i);
                  }
            }

          else if (GET_CODE (real_insn) == CODE_LABEL)
            {
              /* A label clobbers the pending dead registers since neither
                 reload nor jump will propagate a value across a label.  */
              AND_COMPL_HARD_REG_SET (current_live_regs, pending_dead_regs);
              CLEAR_HARD_REG_SET (pending_dead_regs);
            }

          /* The beginning of the epilogue corresponds to the end of the
             RTL chain when there are no epilogue insns.  Certain resources
             are implicitly required at that point.  */
          else if (GET_CODE (real_insn) == NOTE
                   && NOTE_LINE_NUMBER (real_insn) == NOTE_INSN_EPILOGUE_BEG)
            IOR_HARD_REG_SET (current_live_regs, start_of_epilogue_needs.regs);
        }

      COPY_HARD_REG_SET (res->regs, current_live_regs);
      tinfo->block = b;
      tinfo->bb_tick = bb_ticks[b];
    }
  else
    /* We didn't find the start of a basic block.  Assume everything
       in use.  This should happen only extremely rarely.  */
This should happen only extremely rarely. */ 267018334Speter SET_HARD_REG_SET (res->regs); 267118334Speter 267218334Speter /* Now step forward from TARGET looking for registers that are set before 267318334Speter they are used. These are dead. If we pass a label, any pending dead 267418334Speter registers that weren't yet used can be made dead. Stop when we pass a 267518334Speter conditional JUMP_INSN; follow the first few unconditional branches. */ 267618334Speter 267718334Speter CLEAR_RESOURCE (&set); 267818334Speter CLEAR_RESOURCE (&needed); 267918334Speter 268018334Speter for (insn = target; insn; insn = next) 268118334Speter { 268218334Speter rtx this_jump_insn = insn; 268318334Speter 268418334Speter next = NEXT_INSN (insn); 268518334Speter switch (GET_CODE (insn)) 268618334Speter { 268718334Speter case CODE_LABEL: 268818334Speter AND_COMPL_HARD_REG_SET (pending_dead_regs, needed.regs); 268918334Speter AND_COMPL_HARD_REG_SET (res->regs, pending_dead_regs); 269018334Speter CLEAR_HARD_REG_SET (pending_dead_regs); 269118334Speter continue; 269218334Speter 269318334Speter case BARRIER: 269418334Speter case NOTE: 269518334Speter continue; 269618334Speter 269718334Speter case INSN: 269818334Speter if (GET_CODE (PATTERN (insn)) == USE) 269918334Speter { 270018334Speter /* If INSN is a USE made by update_block, we care about the 270118334Speter underlying insn. Any registers set by the underlying insn 270218334Speter are live since the insn is being done somewhere else. */ 270318334Speter if (GET_RTX_CLASS (GET_CODE (XEXP (PATTERN (insn), 0))) == 'i') 270418334Speter mark_set_resources (XEXP (PATTERN (insn), 0), res, 0, 1); 270518334Speter 270618334Speter /* All other USE insns are to be ignored. */ 270718334Speter continue; 270818334Speter } 270918334Speter else if (GET_CODE (PATTERN (insn)) == CLOBBER) 271018334Speter continue; 271118334Speter else if (GET_CODE (PATTERN (insn)) == SEQUENCE) 271218334Speter { 271318334Speter /* An unconditional jump can be used to fill the delay slot 271418334Speter of a call, so search for a JUMP_INSN in any position. 
*/ 271518334Speter for (i = 0; i < XVECLEN (PATTERN (insn), 0); i++) 271618334Speter { 271718334Speter this_jump_insn = XVECEXP (PATTERN (insn), 0, i); 271818334Speter if (GET_CODE (this_jump_insn) == JUMP_INSN) 271918334Speter break; 272018334Speter } 272118334Speter } 272218334Speter } 272318334Speter 272418334Speter if (GET_CODE (this_jump_insn) == JUMP_INSN) 272518334Speter { 272618334Speter if (jump_count++ < 10 272718334Speter && (simplejump_p (this_jump_insn) 272818334Speter || GET_CODE (PATTERN (this_jump_insn)) == RETURN)) 272918334Speter { 273018334Speter next = next_active_insn (JUMP_LABEL (this_jump_insn)); 273118334Speter if (jump_insn == 0) 273218334Speter { 273318334Speter jump_insn = insn; 273418334Speter jump_target = JUMP_LABEL (this_jump_insn); 273518334Speter } 273618334Speter } 273718334Speter else 273818334Speter break; 273918334Speter } 274018334Speter 274118334Speter mark_referenced_resources (insn, &needed, 1); 274218334Speter mark_set_resources (insn, &set, 0, 1); 274318334Speter 274418334Speter COPY_HARD_REG_SET (scratch, set.regs); 274518334Speter AND_COMPL_HARD_REG_SET (scratch, needed.regs); 274618334Speter AND_COMPL_HARD_REG_SET (res->regs, scratch); 274718334Speter } 274818334Speter 274918334Speter /* If we hit an unconditional branch, we have another way of finding out 275018334Speter what is live: we can see what is live at the branch target and include 275118334Speter anything used but not set before the branch. The only things that are 275218334Speter live are those that are live using the above test and the test below. 275318334Speter 275418334Speter Don't try this if we expired our jump count above, since that would 275518334Speter mean there may be an infinite loop in the function being compiled. */ 275618334Speter 275718334Speter if (jump_insn && jump_count < 10) 275818334Speter { 275918334Speter struct resources new_resources; 276018334Speter rtx stop_insn = next_active_insn (jump_insn); 276118334Speter 276218334Speter mark_target_live_regs (next_active_insn (jump_target), &new_resources); 276318334Speter CLEAR_RESOURCE (&set); 276418334Speter CLEAR_RESOURCE (&needed); 276518334Speter 276618334Speter /* Include JUMP_INSN in the needed registers. */ 276718334Speter for (insn = target; insn != stop_insn; insn = next_active_insn (insn)) 276818334Speter { 276918334Speter mark_referenced_resources (insn, &needed, 1); 277018334Speter 277118334Speter COPY_HARD_REG_SET (scratch, needed.regs); 277218334Speter AND_COMPL_HARD_REG_SET (scratch, set.regs); 277318334Speter IOR_HARD_REG_SET (new_resources.regs, scratch); 277418334Speter 277518334Speter mark_set_resources (insn, &set, 0, 1); 277618334Speter } 277718334Speter 277818334Speter AND_HARD_REG_SET (res->regs, new_resources.regs); 277918334Speter } 278018334Speter 278118334Speter COPY_HARD_REG_SET (tinfo->live_regs, res->regs); 278218334Speter} 278318334Speter 278418334Speter/* Scan a function looking for insns that need a delay slot and find insns to 278518334Speter put into the delay slot. 278618334Speter 278718334Speter NON_JUMPS_P is non-zero if we are to only try to fill non-jump insns (such 278818334Speter as calls). We do these first since we don't want jump insns (that are 278918334Speter easier to fill) to get the only insns that could be used for non-jump insns. 279018334Speter When it is zero, only try to fill JUMP_INSNs. 279118334Speter 279218334Speter When slots are filled in this manner, the insns (including the 279318334Speter delay_insn) are put together in a SEQUENCE rtx. 
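  /* For example (register name illustrative only): if the scan from
     TARGET ran into `jmp L', a register that is live at L but neither
     used nor set between TARGET and the jump must be live at TARGET,
     while a register that is dead at L and not used before the jump is
     dead at TARGET as well.  Intersecting the two live sets computes
     exactly this.  */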
  if (jump_insn && jump_count < 10)
    {
      struct resources new_resources;
      rtx stop_insn = next_active_insn (jump_insn);

      mark_target_live_regs (next_active_insn (jump_target), &new_resources);
      CLEAR_RESOURCE (&set);
      CLEAR_RESOURCE (&needed);

      /* Include JUMP_INSN in the needed registers.  */
      for (insn = target; insn != stop_insn; insn = next_active_insn (insn))
	{
	  mark_referenced_resources (insn, &needed, 1);

	  COPY_HARD_REG_SET (scratch, needed.regs);
	  AND_COMPL_HARD_REG_SET (scratch, set.regs);
	  IOR_HARD_REG_SET (new_resources.regs, scratch);

	  mark_set_resources (insn, &set, 0, 1);
	}

      AND_HARD_REG_SET (res->regs, new_resources.regs);
    }

  COPY_HARD_REG_SET (tinfo->live_regs, res->regs);
}

/* Scan a function looking for insns that need a delay slot and find insns to
   put into the delay slot.

   NON_JUMPS_P is non-zero if we are to only try to fill non-jump insns (such
   as calls).  We do these first since we don't want jump insns (that are
   easier to fill) to get the only insns that could be used for non-jump
   insns.  When it is zero, only try to fill JUMP_INSNs.

   When slots are filled in this manner, the insns (including the
   delay_insn) are put together in a SEQUENCE rtx.  In this fashion,
   it is possible to tell whether a delay slot has really been filled
   or not.  `final' knows how to deal with this, by communicating
   through FINAL_SEQUENCE.  */
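/* For instance, a branch with one filled delay slot ends up looking
   roughly like

	(insn (sequence [(jump_insn ...)	;; the branch itself
			 (insn ...)]))		;; the insn in its slot

   while an unfilled branch remains a plain JUMP_INSN, so the two cases
   are easy to tell apart.  */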
static void
fill_simple_delay_slots (first, non_jumps_p)
     rtx first;
     int non_jumps_p;
{
  register rtx insn, pat, trial, next_trial;
  register int i, j;
  int num_unfilled_slots = unfilled_slots_next - unfilled_slots_base;
  struct resources needed, set;
  register int slots_to_fill, slots_filled;
  rtx delay_list;

  for (i = 0; i < num_unfilled_slots; i++)
    {
      int flags;
      /* Get the next insn to fill.  If it has already had any slots assigned,
	 we can't do anything with it.  Maybe we'll improve this later.  */

      insn = unfilled_slots_base[i];
      if (insn == 0
	  || INSN_DELETED_P (insn)
	  || (GET_CODE (insn) == INSN
	      && GET_CODE (PATTERN (insn)) == SEQUENCE)
	  || (GET_CODE (insn) == JUMP_INSN && non_jumps_p)
	  || (GET_CODE (insn) != JUMP_INSN && ! non_jumps_p))
	continue;

      if (GET_CODE (insn) == JUMP_INSN)
	flags = get_jump_flags (insn, JUMP_LABEL (insn));
      else
	flags = get_jump_flags (insn, NULL_RTX);
      slots_to_fill = num_delay_slots (insn);
      if (slots_to_fill == 0)
	abort ();

      /* This insn needs, or can use, some delay slots.  SLOTS_TO_FILL
	 says how many.  After initialization, first try optimizing

	 call _foo		call _foo
	 nop			add %o7,.-L1,%o7
	 b,a L1
	 nop

	 If this case applies, the delay slot of the call is filled with
	 the unconditional jump.  This is done first to avoid having the
	 delay slot of the call filled in the backward scan.  Also, since
	 the unconditional jump is likely to also have a delay slot, that
	 insn must exist when it is subsequently scanned.

	 This is tried on each insn with delay slots as some machines
	 have insns which perform calls, but are not represented as
	 CALL_INSNs.  */

      slots_filled = 0;
      delay_list = 0;

      if ((trial = next_active_insn (insn))
	  && GET_CODE (trial) == JUMP_INSN
	  && simplejump_p (trial)
	  && eligible_for_delay (insn, slots_filled, trial, flags)
	  && no_labels_between_p (insn, trial))
	{
	  rtx *tmp;
	  slots_filled++;
	  delay_list = add_to_delay_list (trial, delay_list);

	  /* TRIAL may have had its delay slot filled, then unfilled.  When
	     the delay slot is unfilled, TRIAL is placed back on the unfilled
	     slots obstack.  Unfortunately, it is placed on the end of the
	     obstack, not in its original location.  Therefore, we must search
	     from entry i + 1 to the end of the unfilled slots obstack to
	     try and find TRIAL.  */
	  tmp = &unfilled_slots_base[i + 1];
	  while (*tmp != trial && tmp != unfilled_slots_next)
	    tmp++;

	  /* Remove the unconditional jump from consideration for delay slot
	     filling and unthread it.  */
	  if (*tmp == trial)
	    *tmp = 0;
	  {
	    rtx next = NEXT_INSN (trial);
	    rtx prev = PREV_INSN (trial);
	    if (prev)
	      NEXT_INSN (prev) = next;
	    if (next)
	      PREV_INSN (next) = prev;
	  }
	}

      /* Now, scan backwards from the insn to search for a potential
	 delay-slot candidate.  Stop searching when a label or jump is hit.

	 For each candidate, if it is to go into the delay slot (moved
	 forward in execution sequence), it must not need or set any resources
	 that were set by later insns and must not set any resources that
	 are needed for those insns.

	 The delay slot insn itself sets resources unless it is a call
	 (in which case the called routine, not the insn itself, is doing
	 the setting).  */
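      /* As an illustration (registers are hypothetical), given

		r3 = r1 + r2
		cmp r4,r5
		beq L1

	 the set of r3 may move into the branch's delay slot, since the
	 `cmp' neither uses nor sets r3; if the `cmp' read r3, moving the
	 set past it would change its input.  */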
      if (slots_filled < slots_to_fill)
	{
	  CLEAR_RESOURCE (&needed);
	  CLEAR_RESOURCE (&set);
	  mark_set_resources (insn, &set, 0, 0);
	  mark_referenced_resources (insn, &needed, 0);

	  for (trial = prev_nonnote_insn (insn); ! stop_search_p (trial, 1);
	       trial = next_trial)
	    {
	      next_trial = prev_nonnote_insn (trial);

	      /* This must be an INSN or CALL_INSN.  */
	      pat = PATTERN (trial);

	      /* USE and CLOBBER at this level are just for flow;
		 ignore them.  */
	      if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER)
		continue;

	      /* Check for resource conflict first, to avoid unnecessary
		 splitting.  */
	      if (! insn_references_resource_p (trial, &set, 1)
		  && ! insn_sets_resource_p (trial, &set, 1)
		  && ! insn_sets_resource_p (trial, &needed, 1)
#ifdef HAVE_cc0
		  /* Can't separate set of cc0 from its use.  */
		  && ! (reg_mentioned_p (cc0_rtx, pat)
			&& ! sets_cc0_p (cc0_rtx, pat))
#endif
		  )
		{
		  trial = try_split (pat, trial, 1);
		  next_trial = prev_nonnote_insn (trial);
		  if (eligible_for_delay (insn, slots_filled, trial, flags))
		    {
		      /* In this case, we are searching backward, so if we
			 find insns to put on the delay list, we want
			 to put them at the head, rather than the
			 tail, of the list.  */

		      update_reg_dead_notes (trial, insn);
		      delay_list = gen_rtx (INSN_LIST, VOIDmode,
					    trial, delay_list);
		      update_block (trial, trial);
		      delete_insn (trial);
		      if (slots_to_fill == ++slots_filled)
			break;
		      continue;
		    }
		}

	      mark_set_resources (trial, &set, 0, 1);
	      mark_referenced_resources (trial, &needed, 1);
	    }
	}

      /* If all needed slots haven't been filled, we come here.  */

      /* Try to optimize case of jumping around a single insn.  */
#if defined(ANNUL_IFFALSE_SLOTS) || defined(ANNUL_IFTRUE_SLOTS)
      if (slots_filled != slots_to_fill
	  && delay_list == 0
	  && GET_CODE (insn) == JUMP_INSN
	  && (condjump_p (insn) || condjump_in_parallel_p (insn)))
	{
	  delay_list = optimize_skip (insn);
	  if (delay_list)
	    slots_filled += 1;
	}
#endif
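      /* optimize_skip handles the common shape

		beq  L1
		insn A
	     L1:

	 On machines with annulling branches, A can go into the branch's
	 delay slot, annulled when the branch is taken, so it still
	 executes only on the fall-through path.  */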
      /* Try to get insns from beyond the insn needing the delay slot.
	 These insns can neither set nor reference resources set in insns
	 being skipped, cannot set resources in the insn being skipped, and,
	 if this is a CALL_INSN (or a CALL_INSN is passed), cannot trap
	 (because the call might not return).

	 There used to be code which continued past the target label if
	 we saw all uses of the target label.  This code did not work,
	 because it failed to account for some instructions which were
	 both annulled and marked as from the target.  This can happen as a
	 result of optimize_skip.  Since this code was redundant with
	 fill_eager_delay_slots anyway, it was just deleted.  */

      if (slots_filled != slots_to_fill
	  && (GET_CODE (insn) != JUMP_INSN
	      || ((condjump_p (insn) || condjump_in_parallel_p (insn))
		  && ! simplejump_p (insn)
		  && JUMP_LABEL (insn) != 0)))
	{
	  rtx target = 0;
	  int maybe_never = 0;
	  struct resources needed_at_jump;

	  CLEAR_RESOURCE (&needed);
	  CLEAR_RESOURCE (&set);

	  if (GET_CODE (insn) == CALL_INSN)
	    {
	      mark_set_resources (insn, &set, 0, 1);
	      mark_referenced_resources (insn, &needed, 1);
	      maybe_never = 1;
	    }
	  else
	    {
	      mark_set_resources (insn, &set, 0, 1);
	      mark_referenced_resources (insn, &needed, 1);
	      if (GET_CODE (insn) == JUMP_INSN)
		target = JUMP_LABEL (insn);
	    }

	  for (trial = next_nonnote_insn (insn); trial; trial = next_trial)
	    {
	      rtx pat, trial_delay;

	      next_trial = next_nonnote_insn (trial);

	      if (GET_CODE (trial) == CODE_LABEL
		  || GET_CODE (trial) == BARRIER)
		break;

	      /* We must have an INSN, JUMP_INSN, or CALL_INSN.  */
	      pat = PATTERN (trial);

	      /* Stand-alone USE and CLOBBER are just for flow.  */
	      if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER)
		continue;

	      /* If this already has filled delay slots, get the insn needing
		 the delay slots.  */
	      if (GET_CODE (pat) == SEQUENCE)
		trial_delay = XVECEXP (pat, 0, 0);
	      else
		trial_delay = trial;

	      /* If this is a jump insn to our target, indicate that we have
		 seen another jump to it.  If we aren't handling a conditional
		 jump, stop our search.  Otherwise, compute the needs at its
		 target and add them to NEEDED.  */
	      if (GET_CODE (trial_delay) == JUMP_INSN)
		{
		  if (target == 0)
		    break;
		  else if (JUMP_LABEL (trial_delay) != target)
		    {
		      mark_target_live_regs
			(next_active_insn (JUMP_LABEL (trial_delay)),
			 &needed_at_jump);
		      needed.memory |= needed_at_jump.memory;
		      needed.unch_memory |= needed_at_jump.unch_memory;
		      IOR_HARD_REG_SET (needed.regs, needed_at_jump.regs);
		    }
		}

	      /* See if we have a resource problem before we try to
		 split.  */
	      if (target == 0
		  && GET_CODE (pat) != SEQUENCE
		  && ! insn_references_resource_p (trial, &set, 1)
		  && ! insn_sets_resource_p (trial, &set, 1)
		  && ! insn_sets_resource_p (trial, &needed, 1)
#ifdef HAVE_cc0
		  && ! (reg_mentioned_p (cc0_rtx, pat) && ! sets_cc0_p (pat))
#endif
		  && ! (maybe_never && may_trap_p (pat))
		  && (trial = try_split (pat, trial, 0))
		  && eligible_for_delay (insn, slots_filled, trial, flags))
		{
		  next_trial = next_nonnote_insn (trial);
		  delay_list = add_to_delay_list (trial, delay_list);

#ifdef HAVE_cc0
		  if (reg_mentioned_p (cc0_rtx, pat))
		    link_cc0_insns (trial);
#endif

		  delete_insn (trial);
		  if (slots_to_fill == ++slots_filled)
		    break;
		  continue;
		}

	      mark_set_resources (trial, &set, 0, 1);
	      mark_referenced_resources (trial, &needed, 1);

	      /* Ensure we don't put insns between the setting of cc and the
		 comparison by moving a setting of cc into an earlier delay
		 slot since these insns could clobber the condition code.  */
	      set.cc = 1;

	      /* If this is a call or jump, we might not get here.  */
	      if (GET_CODE (trial_delay) == CALL_INSN
		  || GET_CODE (trial_delay) == JUMP_INSN)
		maybe_never = 1;
	    }

	  /* If there are slots left to fill and our search was stopped by an
	     unconditional branch, try the insn at the branch target.  We can
	     redirect the branch if it works.

	     Don't do this if the insn at the branch target is a branch.  */
	  if (slots_to_fill != slots_filled
	      && trial
	      && GET_CODE (trial) == JUMP_INSN
	      && simplejump_p (trial)
	      && (target == 0 || JUMP_LABEL (trial) == target)
	      && (next_trial = next_active_insn (JUMP_LABEL (trial))) != 0
	      && ! (GET_CODE (next_trial) == INSN
		    && GET_CODE (PATTERN (next_trial)) == SEQUENCE)
	      && GET_CODE (next_trial) != JUMP_INSN
	      && ! insn_references_resource_p (next_trial, &set, 1)
	      && ! insn_sets_resource_p (next_trial, &set, 1)
	      && ! insn_sets_resource_p (next_trial, &needed, 1)
#ifdef HAVE_cc0
	      && ! reg_mentioned_p (cc0_rtx, PATTERN (next_trial))
#endif
	      && ! (maybe_never && may_trap_p (PATTERN (next_trial)))
	      && (next_trial = try_split (PATTERN (next_trial), next_trial, 0))
	      && eligible_for_delay (insn, slots_filled, next_trial, flags))
	    {
	      rtx new_label = next_active_insn (next_trial);

	      if (new_label != 0)
		new_label = get_label_before (new_label);
	      else
		new_label = find_end_label ();

	      delay_list
		= add_to_delay_list (copy_rtx (next_trial), delay_list);
	      slots_filled++;
	      reorg_redirect_jump (trial, new_label);

	      /* If we merged because we both jumped to the same place,
		 redirect the original insn also.  */
	      if (target)
		reorg_redirect_jump (insn, new_label);
	    }
	}

      if (delay_list)
	unfilled_slots_base[i]
	  = emit_delay_sequence (insn, delay_list,
				 slots_filled, slots_to_fill);

      if (slots_to_fill == slots_filled)
	unfilled_slots_base[i] = 0;

      note_delay_statistics (slots_filled, 0);
    }

#ifdef DELAY_SLOTS_FOR_EPILOGUE
  /* See if the epilogue needs any delay slots.  Try to fill them if so.
     The only thing we can do is scan backwards from the end of the
     function.  If we did this in a previous pass, it is incorrect to do it
     again.  */
  if (current_function_epilogue_delay_list)
    return;

  slots_to_fill = DELAY_SLOTS_FOR_EPILOGUE;
  if (slots_to_fill == 0)
    return;
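  /* A machine description that exposes, say, two delay slots in its
     epilogue would provide something like

	#define DELAY_SLOTS_FOR_EPILOGUE 2
	#define ELIGIBLE_FOR_EPILOGUE_DELAY(TRIAL, SLOTS_FILLED) ...

     where the second macro tests whether TRIAL may go in the next slot
     (the value 2 here is purely illustrative).  */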
  slots_filled = 0;
  CLEAR_RESOURCE (&set);

  /* The frame pointer and stack pointer are needed at the beginning of
     the epilogue, so instructions setting them cannot be put in the
     epilogue delay slot.  However, everything else needed at function
     end is safe, so we don't want to use end_of_function_needs here.  */
  CLEAR_RESOURCE (&needed);
  if (frame_pointer_needed)
    {
      SET_HARD_REG_BIT (needed.regs, FRAME_POINTER_REGNUM);
#if HARD_FRAME_POINTER_REGNUM != FRAME_POINTER_REGNUM
      SET_HARD_REG_BIT (needed.regs, HARD_FRAME_POINTER_REGNUM);
#endif
#ifdef EXIT_IGNORE_STACK
      if (! EXIT_IGNORE_STACK)
#endif
	SET_HARD_REG_BIT (needed.regs, STACK_POINTER_REGNUM);
    }
  else
    SET_HARD_REG_BIT (needed.regs, STACK_POINTER_REGNUM);

  for (trial = get_last_insn (); ! stop_search_p (trial, 1);
       trial = PREV_INSN (trial))
    {
      if (GET_CODE (trial) == NOTE)
	continue;
      pat = PATTERN (trial);
      if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER)
	continue;

      if (! insn_references_resource_p (trial, &set, 1)
	  && ! insn_sets_resource_p (trial, &needed, 1)
	  && ! insn_sets_resource_p (trial, &set, 1)
#ifdef HAVE_cc0
	  /* Don't want to mess with cc0 here.  */
	  && ! reg_mentioned_p (cc0_rtx, pat)
#endif
	  )
	{
	  trial = try_split (pat, trial, 1);
	  if (ELIGIBLE_FOR_EPILOGUE_DELAY (trial, slots_filled))
	    {
	      /* Here as well we are searching backward, so put the
		 insns we find on the head of the list.  */

	      current_function_epilogue_delay_list
		= gen_rtx (INSN_LIST, VOIDmode, trial,
			   current_function_epilogue_delay_list);
	      mark_referenced_resources (trial, &end_of_function_needs, 1);
	      update_block (trial, trial);
	      delete_insn (trial);

	      /* Clear deleted bit so final.c will output the insn.  */
	      INSN_DELETED_P (trial) = 0;

	      if (slots_to_fill == ++slots_filled)
		break;
	      continue;
	    }
	}

      mark_set_resources (trial, &set, 0, 1);
      mark_referenced_resources (trial, &needed, 1);
    }

  note_delay_statistics (slots_filled, 0);
#endif
}

/* Try to find insns to place in delay slots.

   INSN is the jump needing SLOTS_TO_FILL delay slots.  It tests CONDITION
   or is an unconditional branch if CONDITION is const_true_rtx.
   *PSLOTS_FILLED is updated with the number of slots that we have filled.

   THREAD is a flow-of-control thread: either the insns to be executed if
   the branch is true or those to be executed if it is false;
   THREAD_IF_TRUE says which.

   OPPOSITE_THREAD is the thread in the opposite direction.  It is used
   to see if any potential delay slot insns set things needed there.

   LIKELY is non-zero if it is extremely likely that the branch will be
   taken and THREAD_IF_TRUE is set.  This is used for the branch at the
   end of a loop back up to the top.

   OWN_THREAD and OWN_OPPOSITE_THREAD are true if we are the only user of the
   thread.  I.e., it is the fallthrough code of our jump or the target of the
   jump when we are the only jump going there.

   If OWN_THREAD is false, it must be the "true" thread of a jump.  In that
   case, we can only take insns from the head of the thread for our delay
   slot.  We then adjust the jump to point after the insns we have taken.  */
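/* To make the thread terminology concrete: for

	beq  L2
	...		; fall-through: the "false" thread
     L2:
	...		; branch target: the "true" thread

   a thread is "owned" when this jump is the only way to reach its code;
   only then may an insn moved into the delay slot also be deleted from
   the thread itself.  */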
static rtx
fill_slots_from_thread (insn, condition, thread, opposite_thread, likely,
			thread_if_true, own_thread, own_opposite_thread,
			slots_to_fill, pslots_filled)
     rtx insn;
     rtx condition;
     rtx thread, opposite_thread;
     int likely;
     int thread_if_true;
     int own_thread, own_opposite_thread;
     int slots_to_fill, *pslots_filled;
{
  rtx new_thread;
  rtx delay_list = 0;
  struct resources opposite_needed, set, needed;
  rtx trial;
  int lose = 0;
  int must_annul = 0;
  int flags;

  /* Validate our arguments.  */
  if ((condition == const_true_rtx && ! thread_if_true)
      || (! own_thread && ! thread_if_true))
    abort ();

  flags = get_jump_flags (insn, JUMP_LABEL (insn));

  /* If our thread is the end of subroutine, we can't get any delay
     insns from that.  */
  if (thread == 0)
    return 0;

  /* If this is an unconditional branch, nothing is needed at the
     opposite thread.  Otherwise, compute what is needed there.  */
  if (condition == const_true_rtx)
    CLEAR_RESOURCE (&opposite_needed);
  else
    mark_target_live_regs (opposite_thread, &opposite_needed);

  /* If the insn at THREAD can be split, do it here to avoid having to
     update THREAD and NEW_THREAD if it is done in the loop below.  Also
     initialize NEW_THREAD.  */

  new_thread = thread = try_split (PATTERN (thread), thread, 0);

  /* Scan insns at THREAD.  We are looking for an insn that can be removed
     from THREAD (it neither sets nor references resources that were set
     ahead of it and it doesn't set anything needed by the insns ahead of
     it) and that either can be placed in an annulling insn or isn't
     needed at OPPOSITE_THREAD.  */

  CLEAR_RESOURCE (&needed);
  CLEAR_RESOURCE (&set);

  /* If we do not own this thread, we must stop as soon as we find
     something that we can't put in a delay slot, since all we can do
     is branch into THREAD at a later point.  Therefore, labels stop
     the search if this is not the `true' thread.  */

  for (trial = thread;
       ! stop_search_p (trial, ! thread_if_true) && (! lose || own_thread);
       trial = next_nonnote_insn (trial))
    {
      rtx pat, old_trial;

      /* If we have passed a label, we no longer own this thread.  */
      if (GET_CODE (trial) == CODE_LABEL)
	{
	  own_thread = 0;
	  continue;
	}

      pat = PATTERN (trial);
      if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER)
	continue;

      /* If TRIAL conflicts with the insns ahead of it, we lose.  Also,
	 don't separate or copy insns that set and use CC0.  */
      if (! insn_references_resource_p (trial, &set, 1)
	  && ! insn_sets_resource_p (trial, &set, 1)
	  && ! insn_sets_resource_p (trial, &needed, 1)
#ifdef HAVE_cc0
	  && ! (reg_mentioned_p (cc0_rtx, pat)
		&& (! own_thread || ! sets_cc0_p (pat)))
#endif
	  )
	{
	  rtx prior_insn;

	  /* If TRIAL is redundant with some insn before INSN, we don't
	     actually need to add it to the delay list; we can merely pretend
	     we did.  */
	  if (prior_insn = redundant_insn (trial, insn, delay_list))
	    {
	      if (own_thread)
		{
		  update_block (trial, thread);
		  if (trial == thread)
		    {
		      thread = next_active_insn (thread);
		      if (new_thread == trial)
			new_thread = thread;
		    }

		  delete_insn (trial);
		}
	      else
		{
		  update_reg_unused_notes (prior_insn, trial);
		  new_thread = next_active_insn (trial);
		}

	      continue;
	    }

	  /* There are two ways we can win: If TRIAL doesn't set anything
	     needed at the opposite thread and can't trap, or if it can
	     go into an annulled delay slot.  */
	  if (condition == const_true_rtx
	      || (! insn_sets_resource_p (trial, &opposite_needed, 1)
		  && ! may_trap_p (pat)))
	    {
	      old_trial = trial;
	      trial = try_split (pat, trial, 0);
	      if (new_thread == old_trial)
		new_thread = trial;
	      if (thread == old_trial)
		thread = trial;
	      pat = PATTERN (trial);
	      if (eligible_for_delay (insn, *pslots_filled, trial, flags))
		goto winner;
	    }
	  else if (0
#ifdef ANNUL_IFTRUE_SLOTS
		   || ! thread_if_true
#endif
#ifdef ANNUL_IFFALSE_SLOTS
		   || thread_if_true
#endif
		   )
	    {
	      old_trial = trial;
	      trial = try_split (pat, trial, 0);
	      if (new_thread == old_trial)
		new_thread = trial;
	      if (thread == old_trial)
		thread = trial;
	      pat = PATTERN (trial);
	      if ((thread_if_true
		   ? eligible_for_annul_false (insn, *pslots_filled, trial,
					       flags)
		   : eligible_for_annul_true (insn, *pslots_filled, trial,
					      flags)))
		{
		  rtx temp;

		  must_annul = 1;
		winner:

#ifdef HAVE_cc0
		  if (reg_mentioned_p (cc0_rtx, pat))
		    link_cc0_insns (trial);
#endif

		  /* If we own this thread, delete the insn.  If this is the
		     destination of a branch, show that a basic block status
		     may have been updated.  In any case, mark the new
		     starting point of this thread.  */
		  if (own_thread)
		    {
		      update_block (trial, thread);
		      delete_insn (trial);
		    }
		  else
		    new_thread = next_active_insn (trial);

		  temp = own_thread ? trial : copy_rtx (trial);
		  if (thread_if_true)
		    INSN_FROM_TARGET_P (temp) = 1;

		  delay_list = add_to_delay_list (temp, delay_list);

		  if (slots_to_fill == ++(*pslots_filled))
		    {
		      /* Even though we have filled all the slots, we
			 may be branching to a location that has a
			 redundant insn.  Skip any if so.  */
		      while (new_thread && ! own_thread
			     && ! insn_sets_resource_p (new_thread, &set, 1)
			     && ! insn_sets_resource_p (new_thread,
							&needed, 1)
			     && ! insn_references_resource_p (new_thread,
							      &set, 1)
			     && redundant_insn (new_thread, insn, delay_list))
			new_thread = next_active_insn (new_thread);
		      break;
		    }

		  continue;
		}
	    }
	}

      /* This insn can't go into a delay slot.  */
      lose = 1;
      mark_set_resources (trial, &set, 0, 1);
      mark_referenced_resources (trial, &needed, 1);

      /* Ensure we don't put insns between the setting of cc and the
	 comparison by moving a setting of cc into an earlier delay slot
	 since these insns could clobber the condition code.  */
      set.cc = 1;

      /* If this insn is a register-register copy and the next insn has
	 a use of our destination, change it to use our source.  That way,
	 it will become a candidate for our delay slot the next time
	 through this loop.  This case occurs commonly in loops that
	 scan a list.

	 We could check for more complex cases than those tested below,
	 but it doesn't seem worth it.  It might also be a good idea to try
	 to swap the two insns.  That might do better.

	 We can't do this if the next insn modifies our destination, because
	 that would make the replacement into the insn invalid.  We also can't
	 do this if it modifies our source, because it might be an
	 earlyclobber operand.  This latter test also prevents updating the
	 contents of a PRE_INC.  */
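      /* For example (hypothetical registers), given

		r5 = r4
		r6 = r5 + 1

	 rewriting the second insn as `r6 = r4 + 1' removes its dependence
	 on the copy, so the next pass of this loop can consider it for
	 the delay slot even though the copy itself stays behind.  */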
      if (GET_CODE (trial) == INSN && GET_CODE (pat) == SET
	  && GET_CODE (SET_SRC (pat)) == REG
	  && GET_CODE (SET_DEST (pat)) == REG)
	{
	  rtx next = next_nonnote_insn (trial);

	  if (next && GET_CODE (next) == INSN
	      && GET_CODE (PATTERN (next)) != USE
	      && ! reg_set_p (SET_DEST (pat), next)
	      && ! reg_set_p (SET_SRC (pat), next)
	      && reg_referenced_p (SET_DEST (pat), PATTERN (next)))
	    validate_replace_rtx (SET_DEST (pat), SET_SRC (pat), next);
	}
    }

  /* If we stopped on a branch insn that has delay slots, see if we can
     steal some of the insns in those slots.  */
  if (trial && GET_CODE (trial) == INSN
      && GET_CODE (PATTERN (trial)) == SEQUENCE
      && GET_CODE (XVECEXP (PATTERN (trial), 0, 0)) == JUMP_INSN)
    {
      /* If this is the `true' thread, we will want to follow the jump,
	 so we can only do this if we have taken everything up to here.  */
      if (thread_if_true && trial == new_thread)
	delay_list
	  = steal_delay_list_from_target (insn, condition, PATTERN (trial),
					  delay_list, &set, &needed,
					  &opposite_needed, slots_to_fill,
					  pslots_filled, &must_annul,
					  &new_thread);
      else if (! thread_if_true)
	delay_list
	  = steal_delay_list_from_fallthrough (insn, condition,
					       PATTERN (trial),
					       delay_list, &set, &needed,
					       &opposite_needed, slots_to_fill,
					       pslots_filled, &must_annul);
    }

  /* If we haven't found anything for this delay slot and it is very
     likely that the branch will be taken, see if the insn at our target
     increments or decrements a register with an increment that does not
     depend on the destination register.  If so, try to place the opposite
     arithmetic insn after the jump insn and put the arithmetic insn in the
     delay slot.  If we can't do this, return.  */
  if (delay_list == 0 && likely && new_thread
      && GET_CODE (new_thread) == INSN)
    {
      rtx pat = PATTERN (new_thread);
      rtx dest;
      rtx src;

      trial = new_thread;
      pat = PATTERN (trial);

      if (GET_CODE (trial) != INSN || GET_CODE (pat) != SET
	  || ! eligible_for_delay (insn, 0, trial, flags))
	return 0;

      dest = SET_DEST (pat), src = SET_SRC (pat);
      if ((GET_CODE (src) == PLUS || GET_CODE (src) == MINUS)
	  && rtx_equal_p (XEXP (src, 0), dest)
	  && ! reg_overlap_mentioned_p (dest, XEXP (src, 1)))
	{
	  rtx other = XEXP (src, 1);
	  rtx new_arith;
	  rtx ninsn;

	  /* If this is a constant adjustment, use the same code with
	     the negated constant.  Otherwise, reverse the sense of the
	     arithmetic.  */
	  if (GET_CODE (other) == CONST_INT)
	    new_arith = gen_rtx (GET_CODE (src), GET_MODE (src), dest,
				 negate_rtx (GET_MODE (src), other));
	  else
	    new_arith = gen_rtx (GET_CODE (src) == PLUS ? MINUS : PLUS,
				 GET_MODE (src), dest, other);

	  ninsn = emit_insn_after (gen_rtx (SET, VOIDmode, dest, new_arith),
				   insn);

	  if (recog_memoized (ninsn) < 0
	      || (insn_extract (ninsn),
		  ! constrain_operands (INSN_CODE (ninsn), 1)))
	    {
	      delete_insn (ninsn);
	      return 0;
	    }

	  if (own_thread)
	    {
	      update_block (trial, thread);
	      delete_insn (trial);
	    }
	  else
	    new_thread = next_active_insn (trial);

	  ninsn = own_thread ? trial : copy_rtx (trial);
	  if (thread_if_true)
	    INSN_FROM_TARGET_P (ninsn) = 1;

	  delay_list = add_to_delay_list (ninsn, NULL_RTX);
	  (*pslots_filled)++;
	}
    }
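  /* Concretely: if a loop's backward branch is almost always taken and
     the insn at the loop top is `r2 = r2 + 4', we may put that increment
     in the delay slot and emit `r2 = r2 - 4' after the branch, so the
     rare fall-through path undoes the adjustment (register and constant
     are illustrative only).  */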
  if (delay_list && must_annul)
    INSN_ANNULLED_BRANCH_P (insn) = 1;

  /* If we are to branch into the middle of this thread, find an appropriate
     label or make a new one if none, and redirect INSN to it.  If we hit the
     end of the function, use the end-of-function label.  */
  if (new_thread != thread)
    {
      rtx label;

      if (! thread_if_true)
	abort ();

      if (new_thread && GET_CODE (new_thread) == JUMP_INSN
	  && (simplejump_p (new_thread)
	      || GET_CODE (PATTERN (new_thread)) == RETURN)
	  && redirect_with_delay_list_safe_p (insn,
					      JUMP_LABEL (new_thread),
					      delay_list))
	new_thread = follow_jumps (JUMP_LABEL (new_thread));

      if (new_thread == 0)
	label = find_end_label ();
      else if (GET_CODE (new_thread) == CODE_LABEL)
	label = new_thread;
      else
	label = get_label_before (new_thread);

      reorg_redirect_jump (insn, label);
    }

  return delay_list;
}

/* Make another attempt to find insns to place in delay slots.

   We previously looked for insns located in front of the delay insn
   and, for non-jump delay insns, located behind the delay insn.

   Here only try to schedule jump insns and try to move insns from either
   the target or the following insns into the delay slot.  If annulling is
   supported, we will be likely to do this.  Otherwise, we can do this only
   if safe.  */

static void
fill_eager_delay_slots (first)
     rtx first;
{
  register rtx insn;
  register int i;
  int num_unfilled_slots = unfilled_slots_next - unfilled_slots_base;

  for (i = 0; i < num_unfilled_slots; i++)
    {
      rtx condition;
      rtx target_label, insn_at_target, fallthrough_insn;
      rtx delay_list = 0;
      int own_target;
      int own_fallthrough;
      int prediction, slots_to_fill, slots_filled;

      insn = unfilled_slots_base[i];
      if (insn == 0
	  || INSN_DELETED_P (insn)
	  || GET_CODE (insn) != JUMP_INSN
	  || ! (condjump_p (insn) || condjump_in_parallel_p (insn)))
	continue;

      slots_to_fill = num_delay_slots (insn);
      if (slots_to_fill == 0)
	abort ();

      slots_filled = 0;
      target_label = JUMP_LABEL (insn);
      condition = get_branch_condition (insn, target_label);

      if (condition == 0)
	continue;

      /* Get the next active fallthrough and target insns and see if we own
	 them.  Then see whether the branch is likely true.  We don't need
	 to do a lot of this for unconditional branches.  */

      insn_at_target = next_active_insn (target_label);
      own_target = own_thread_p (target_label, target_label, 0);

      if (condition == const_true_rtx)
	{
	  own_fallthrough = 0;
	  fallthrough_insn = 0;
	  prediction = 2;
	}
      else
	{
	  fallthrough_insn = next_active_insn (insn);
	  own_fallthrough = own_thread_p (NEXT_INSN (insn), NULL_RTX, 1);
	  prediction = mostly_true_jump (insn, condition);
	}

      /* If this insn is expected to branch, first try to get insns from our
	 target, then our fallthrough insns.  If it is not expected to
	 branch, try the other order.  */

      if (prediction > 0)
	{
	  delay_list
	    = fill_slots_from_thread (insn, condition, insn_at_target,
				      fallthrough_insn, prediction == 2, 1,
				      own_target, own_fallthrough,
				      slots_to_fill, &slots_filled);

	  if (delay_list == 0 && own_fallthrough)
	    {
	      /* Even though we didn't find anything for delay slots,
		 we might have found a redundant insn which we deleted
		 from the thread that was filled.  So we have to recompute
		 the next insn at the target.  */
	      target_label = JUMP_LABEL (insn);
	      insn_at_target = next_active_insn (target_label);

	      delay_list
		= fill_slots_from_thread (insn, condition, fallthrough_insn,
					  insn_at_target, 0, 0,
					  own_fallthrough, own_target,
					  slots_to_fill, &slots_filled);
	    }
	}
      else
	{
	  if (own_fallthrough)
	    delay_list
	      = fill_slots_from_thread (insn, condition, fallthrough_insn,
					insn_at_target, 0, 0,
					own_fallthrough, own_target,
					slots_to_fill, &slots_filled);

	  if (delay_list == 0)
	    delay_list
	      = fill_slots_from_thread (insn, condition, insn_at_target,
					next_active_insn (insn), 0, 1,
					own_target, own_fallthrough,
					slots_to_fill, &slots_filled);
	}

      if (delay_list)
	unfilled_slots_base[i]
	  = emit_delay_sequence (insn, delay_list,
				 slots_filled, slots_to_fill);

      if (slots_to_fill == slots_filled)
	unfilled_slots_base[i] = 0;

      note_delay_statistics (slots_filled, 1);
    }
}

/* Once we have tried two ways to fill a delay slot, make a pass over the
   code to try to improve the results and to do such things as more jump
   threading.  */

static void
relax_delay_slots (first)
     rtx first;
{
  register rtx insn, next, pat;
  register rtx trial, delay_insn, target_label;

  /* Look at every JUMP_INSN and see if we can improve it.  */
  for (insn = first; insn; insn = next)
    {
      rtx other;

      next = next_active_insn (insn);

      /* If this is a jump insn, see if it now jumps to a jump, jumps to
	 the next insn, or jumps to a label that is not the last of a
	 group of consecutive labels.  */
      if (GET_CODE (insn) == JUMP_INSN
	  && (condjump_p (insn) || condjump_in_parallel_p (insn))
	  && (target_label = JUMP_LABEL (insn)) != 0)
	{
	  target_label = follow_jumps (target_label);
	  target_label = prev_label (next_active_insn (target_label));

	  if (target_label == 0)
	    target_label = find_end_label ();

	  if (next_active_insn (target_label) == next
	      && ! condjump_in_parallel_p (insn))
	    {
	      delete_jump (insn);
	      continue;
	    }

	  if (target_label != JUMP_LABEL (insn))
	    reorg_redirect_jump (insn, target_label);

	  /* See if this jump branches around an unconditional jump.
	     If so, invert this jump and point it to the target of the
	     second jump.  */
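	  /* I.e., turn

		beq  L1
		jmp  L2
	     L1:

	     into `bne L2' when L1 immediately follows, removing both the
	     unconditional jump and a taken branch from the common path.  */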
	  if (next && GET_CODE (next) == JUMP_INSN
	      && (simplejump_p (next) || GET_CODE (PATTERN (next)) == RETURN)
	      && next_active_insn (target_label) == next_active_insn (next)
	      && no_labels_between_p (insn, next))
	    {
	      rtx label = JUMP_LABEL (next);

	      /* Be careful how we do this to avoid deleting code or
		 labels that are momentarily dead.  See similar optimization
		 in jump.c.

		 We also need to ensure we properly handle the case when
		 invert_jump fails.  */

	      ++LABEL_NUSES (target_label);
	      if (label)
		++LABEL_NUSES (label);

	      if (invert_jump (insn, label))
		{
		  delete_insn (next);
		  next = insn;
		}

	      if (label)
		--LABEL_NUSES (label);

	      if (--LABEL_NUSES (target_label) == 0)
		delete_insn (target_label);

	      continue;
	    }
	}

      /* If this is an unconditional jump and the previous insn is a
	 conditional jump, try reversing the condition of the previous
	 insn and swapping our targets.  The next pass might be able to
	 fill the slots.

	 Don't do this if we expect the conditional branch to be true, because
	 we would then be making the more common case longer.  */

      if (GET_CODE (insn) == JUMP_INSN
	  && (simplejump_p (insn) || GET_CODE (PATTERN (insn)) == RETURN)
	  && (other = prev_active_insn (insn)) != 0
	  && (condjump_p (other) || condjump_in_parallel_p (other))
	  && no_labels_between_p (other, insn)
	  && 0 < mostly_true_jump (other,
				   get_branch_condition (other,
							 JUMP_LABEL (other))))
	{
	  rtx other_target = JUMP_LABEL (other);
	  target_label = JUMP_LABEL (insn);

	  /* Increment the count of OTHER_TARGET, so it doesn't get deleted
	     as we move the label.  */
	  if (other_target)
	    ++LABEL_NUSES (other_target);

	  if (invert_jump (other, target_label))
	    reorg_redirect_jump (insn, other_target);

	  if (other_target)
	    --LABEL_NUSES (other_target);
	}

      /* Now look only at cases where we have filled a delay slot.  */
      if (GET_CODE (insn) != INSN
	  || GET_CODE (PATTERN (insn)) != SEQUENCE)
	continue;

      pat = PATTERN (insn);
      delay_insn = XVECEXP (pat, 0, 0);

      /* See if the first insn in the delay slot is redundant with some
	 previous insn.  Remove it from the delay slot if so; then set up
	 to reprocess this insn.  */
      if (redundant_insn (XVECEXP (pat, 0, 1), delay_insn, 0))
	{
	  delete_from_delay_slot (XVECEXP (pat, 0, 1));
	  next = prev_active_insn (next);
	  continue;
	}

      /* Now look only at the cases where we have a filled JUMP_INSN.  */
      if (GET_CODE (XVECEXP (PATTERN (insn), 0, 0)) != JUMP_INSN
	  || ! (condjump_p (XVECEXP (PATTERN (insn), 0, 0))
		|| condjump_in_parallel_p (XVECEXP (PATTERN (insn), 0, 0))))
	continue;

      target_label = JUMP_LABEL (delay_insn);

      if (target_label)
	{
	  /* If this jump goes to another unconditional jump, thread it, but
	     don't convert a jump into a RETURN here.  */
	  trial = follow_jumps (target_label);
	  /* We use next_real_insn instead of next_active_insn, so that
	     the special USE insns emitted by reorg won't be ignored.
	     If they are ignored, then they will get deleted if target_label
	     is now unreachable, and that would cause mark_target_live_regs
	     to fail.  */
	  trial = prev_label (next_real_insn (trial));
	  if (trial == 0 && target_label != 0)
	    trial = find_end_label ();

	  if (trial != target_label
	      && redirect_with_delay_slots_safe_p (delay_insn, trial, insn))
	    {
	      reorg_redirect_jump (delay_insn, trial);
	      target_label = trial;
	    }

	  /* If the first insn at TARGET_LABEL is redundant with a previous
	     insn, redirect the jump to the following insn and process
	     again.  */
	  trial = next_active_insn (target_label);
	  if (trial && GET_CODE (PATTERN (trial)) != SEQUENCE
	      && redundant_insn (trial, insn, 0))
	    {
	      trial = next_active_insn (trial);
	      if (trial == 0)
		target_label = find_end_label ();
	      else
		target_label = get_label_before (trial);
	      reorg_redirect_jump (delay_insn, target_label);
	      next = insn;
	      continue;
	    }

	  /* Similarly, if it is an unconditional jump with one insn in its
	     delay list and that insn is redundant, thread the jump.  */
	  if (trial && GET_CODE (PATTERN (trial)) == SEQUENCE
	      && XVECLEN (PATTERN (trial), 0) == 2
	      && GET_CODE (XVECEXP (PATTERN (trial), 0, 0)) == JUMP_INSN
	      && (simplejump_p (XVECEXP (PATTERN (trial), 0, 0))
		  || GET_CODE (PATTERN (XVECEXP (PATTERN (trial), 0, 0)))
		     == RETURN)
	      && redundant_insn (XVECEXP (PATTERN (trial), 0, 1), insn, 0))
	    {
	      target_label = JUMP_LABEL (XVECEXP (PATTERN (trial), 0, 0));
	      if (target_label == 0)
		target_label = find_end_label ();

	      if (redirect_with_delay_slots_safe_p (delay_insn, target_label,
						    insn))
		{
		  reorg_redirect_jump (delay_insn, target_label);
		  next = insn;
		  continue;
		}
	    }
	}

      if (! INSN_ANNULLED_BRANCH_P (delay_insn)
	  && prev_active_insn (target_label) == insn
	  && ! condjump_in_parallel_p (delay_insn)
#ifdef HAVE_cc0
	  /* If the last insn in the delay slot sets CC0 for some insn,
	     various code assumes that it is in a delay slot.  We could
	     put it back where it belonged and delete the register notes,
	     but it doesn't seem worthwhile in this uncommon case.  */
	  && ! find_reg_note (XVECEXP (pat, 0, XVECLEN (pat, 0) - 1),
			      REG_CC_USER, NULL_RTX)
#endif
	  )
	{
	  int i;

	  /* All this insn does is execute its delay list and jump to the
	     following insn.  So delete the jump and just execute the delay
	     list insns.

	     We do this by deleting the INSN containing the SEQUENCE, then
	     re-emitting the insns separately, and then deleting the jump.
	     This allows the count of the jump target to be properly
	     decremented.  */

	  /* Clear the from target bit, since these insns are no longer
	     in delay slots.  */
	  for (i = 0; i < XVECLEN (pat, 0); i++)
	    INSN_FROM_TARGET_P (XVECEXP (pat, 0, i)) = 0;

	  trial = PREV_INSN (insn);
	  delete_insn (insn);
	  emit_insn_after (pat, trial);
	  delete_scheduled_jump (delay_insn);
	  continue;
	}

      /* See if this is an unconditional jump around a single insn which is
	 identical to the one in its delay slot.  In this case, we can just
	 delete the branch and the insn in its delay slot.  */
      if (next && GET_CODE (next) == INSN
	  && prev_label (next_active_insn (next)) == target_label
	  && simplejump_p (insn)
	  && XVECLEN (pat, 0) == 2
	  && rtx_equal_p (PATTERN (next), PATTERN (XVECEXP (pat, 0, 1))))
	{
	  delete_insn (insn);
	  continue;
	}
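      /* That is, in

		jmp  L1		; delay slot: insn A
		insn A
	     L1:

	 every path executes A exactly once before reaching L1, so the
	 jump and the copy of A in its slot are deleted and the remaining
	 copy of A falls through to L1.  */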
      /* See if this jump (with its delay slots) branches around another
	 jump (without delay slots).  If so, invert this jump and point
	 it to the target of the second jump.  We cannot do this for
	 annulled jumps, though.  Again, don't convert a jump to a RETURN
	 here.  */
      if (! INSN_ANNULLED_BRANCH_P (delay_insn)
	  && next && GET_CODE (next) == JUMP_INSN
	  && (simplejump_p (next) || GET_CODE (PATTERN (next)) == RETURN)
	  && next_active_insn (target_label) == next_active_insn (next)
	  && no_labels_between_p (insn, next))
	{
	  rtx label = JUMP_LABEL (next);
	  rtx old_label = JUMP_LABEL (delay_insn);

	  if (label == 0)
	    label = find_end_label ();

	  if (redirect_with_delay_slots_safe_p (delay_insn, label, insn))
	    {
	      /* Be careful how we do this to avoid deleting code or labels
		 that are momentarily dead.  See similar optimization in
		 jump.c.  */
	      if (old_label)
		++LABEL_NUSES (old_label);

	      if (invert_jump (delay_insn, label))
		{
		  int i;

		  /* Must update the INSN_FROM_TARGET_P bits now that
		     the branch is reversed, so that mark_target_live_regs
		     will handle the delay slot insn correctly.  */
		  for (i = 1; i < XVECLEN (PATTERN (insn), 0); i++)
		    {
		      rtx slot = XVECEXP (PATTERN (insn), 0, i);
		      INSN_FROM_TARGET_P (slot) = ! INSN_FROM_TARGET_P (slot);
		    }

		  delete_insn (next);
		  next = insn;
		}

	      if (old_label && --LABEL_NUSES (old_label) == 0)
		delete_insn (old_label);
	      continue;
	    }
	}
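      /* Sketch of the inversion above; "beq"/"bne" stand for any
	 conditional branch and its inverse (hypothetical code):

	     beq L1		! conditional jump with filled delay slot
	      <slot insn>
	     b L2		! NEXT, a simple jump with no delay slot
	 L1: ...

	 becomes

	     bne L2		! inverted condition, same delay slot
	      <slot insn>
	 L1: ...

	 eliminating the second jump entirely.  */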
      /* If we own the thread opposite the way this insn branches, see if we
	 can merge its delay slots with following insns.  */
      if (INSN_FROM_TARGET_P (XVECEXP (pat, 0, 1))
	  && own_thread_p (NEXT_INSN (insn), 0, 1))
	try_merge_delay_insns (insn, next);
      else if (! INSN_FROM_TARGET_P (XVECEXP (pat, 0, 1))
	       && own_thread_p (target_label, target_label, 0))
	try_merge_delay_insns (insn, next_active_insn (target_label));

      /* If we get here, we haven't deleted INSN.  But we may have deleted
	 NEXT, so recompute it.  */
      next = next_active_insn (insn);
    }
}

#ifdef HAVE_return

/* Look for filled jumps to the end of function label.  We can try to convert
   them into RETURN insns if the insns in the delay slot are valid for the
   RETURN as well.  */

static void
make_return_insns (first)
     rtx first;
{
  rtx insn, jump_insn, pat;
  rtx real_return_label = end_of_function_label;
  int slots, i;

  /* See if there is a RETURN insn in the function other than the one we
     made for END_OF_FUNCTION_LABEL.  If so, set up anything we can't change
     into a RETURN to jump to it.  */
  for (insn = first; insn; insn = NEXT_INSN (insn))
    if (GET_CODE (insn) == JUMP_INSN && GET_CODE (PATTERN (insn)) == RETURN)
      {
	real_return_label = get_label_before (insn);
	break;
      }

  /* Show an extra usage of REAL_RETURN_LABEL so it won't go away if it
     was equal to END_OF_FUNCTION_LABEL.  */
  LABEL_NUSES (real_return_label)++;

  /* Clear the list of insns to fill so we can use it.  */
  obstack_free (&unfilled_slots_obstack, unfilled_firstobj);

  for (insn = first; insn; insn = NEXT_INSN (insn))
    {
      int flags;

      /* Only look at filled JUMP_INSNs that go to the end of function
	 label.  */
      if (GET_CODE (insn) != INSN
	  || GET_CODE (PATTERN (insn)) != SEQUENCE
	  || GET_CODE (XVECEXP (PATTERN (insn), 0, 0)) != JUMP_INSN
	  || JUMP_LABEL (XVECEXP (PATTERN (insn), 0, 0)) != end_of_function_label)
	continue;

      pat = PATTERN (insn);
      jump_insn = XVECEXP (pat, 0, 0);

      /* If we can't make the jump into a RETURN, try to redirect it to the
	 best RETURN and go on to the next insn.  */
      if (! reorg_redirect_jump (jump_insn, NULL_RTX))
	{
	  /* Make sure redirecting the jump will not invalidate the delay
	     slot insns.  */
	  if (redirect_with_delay_slots_safe_p (jump_insn,
						real_return_label,
						insn))
	    reorg_redirect_jump (jump_insn, real_return_label);
	  continue;
	}

      /* See if this RETURN can accept the insns currently in its delay
	 slot.  It can if it has at least as many slots and the contents
	 of each are valid.  */

      flags = get_jump_flags (jump_insn, JUMP_LABEL (jump_insn));
      slots = num_delay_slots (jump_insn);
      if (slots >= XVECLEN (pat, 0) - 1)
	{
	  for (i = 1; i < XVECLEN (pat, 0); i++)
	    if (! (
#ifdef ANNUL_IFFALSE_SLOTS
		   (INSN_ANNULLED_BRANCH_P (jump_insn)
		    && INSN_FROM_TARGET_P (XVECEXP (pat, 0, i)))
		   ? eligible_for_annul_false (jump_insn, i - 1,
					       XVECEXP (pat, 0, i), flags) :
#endif
#ifdef ANNUL_IFTRUE_SLOTS
		   (INSN_ANNULLED_BRANCH_P (jump_insn)
		    && ! INSN_FROM_TARGET_P (XVECEXP (pat, 0, i)))
		   ? eligible_for_annul_true (jump_insn, i - 1,
					      XVECEXP (pat, 0, i), flags) :
#endif
		   eligible_for_delay (jump_insn, i - 1,
				       XVECEXP (pat, 0, i), flags)))
	      break;
	}
      else
	i = 0;

      if (i == XVECLEN (pat, 0))
	continue;

      /* We have to do something with this insn.  If it is an unconditional
	 RETURN, delete the SEQUENCE and output the individual insns,
	 followed by the RETURN.  Then set things up so we try to find
	 insns for its delay slots, if it needs some.  */
      if (GET_CODE (PATTERN (jump_insn)) == RETURN)
	{
	  rtx prev = PREV_INSN (insn);

	  delete_insn (insn);
	  for (i = 1; i < XVECLEN (pat, 0); i++)
	    prev = emit_insn_after (PATTERN (XVECEXP (pat, 0, i)), prev);

	  insn = emit_jump_insn_after (PATTERN (jump_insn), prev);
	  emit_barrier_after (insn);

	  if (slots)
	    obstack_ptr_grow (&unfilled_slots_obstack, insn);
	}
      else
	/* It is probably more efficient to keep this with its current
	   delay slot as a branch to a RETURN.  */
	reorg_redirect_jump (jump_insn, real_return_label);
    }
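  /* For illustration (hypothetical one-slot target; "restore" is a made-up
     slot insn), a filled jump to the end-of-function label such as

	 b end_label
	  restore

     becomes

	 return
	  restore

     when "restore" is also valid in the RETURN's delay slot; otherwise the
     jump is merely redirected to the best RETURN found above.  */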
  /* Now delete REAL_RETURN_LABEL if we never used it.  Then try to fill any
     new delay slots we have created.  */
  if (--LABEL_NUSES (real_return_label) == 0)
    delete_insn (real_return_label);

  fill_simple_delay_slots (first, 1);
  fill_simple_delay_slots (first, 0);
}
#endif

/* Try to find insns to place in delay slots.  */

void
dbr_schedule (first, file)
     rtx first;
     FILE *file;
{
  rtx insn, next, epilogue_insn = 0;
  int i;
#if 0
  int old_flag_no_peephole = flag_no_peephole;

  /* Execute `final' once in prescan mode to delete any insns that won't be
     used.  Don't let final try to do any peephole optimization--it will
     ruin dataflow information for this pass.  */

  flag_no_peephole = 1;
  final (first, 0, NO_DEBUG, 1, 1);
  flag_no_peephole = old_flag_no_peephole;
#endif

  /* If the current function has no insns other than the prologue and
     epilogue, then do not try to fill any delay slots.  */
  if (n_basic_blocks == 0)
    return;

  /* Find the highest INSN_UID and allocate and initialize our map from
     INSN_UID's to position in code.  */
  for (max_uid = 0, insn = first; insn; insn = NEXT_INSN (insn))
    {
      if (INSN_UID (insn) > max_uid)
	max_uid = INSN_UID (insn);
      if (GET_CODE (insn) == NOTE
	  && NOTE_LINE_NUMBER (insn) == NOTE_INSN_EPILOGUE_BEG)
	epilogue_insn = insn;
    }

  uid_to_ruid = (int *) alloca ((max_uid + 1) * sizeof (int));
  for (i = 0, insn = first; insn; i++, insn = NEXT_INSN (insn))
    uid_to_ruid[INSN_UID (insn)] = i;

  /* Initialize the list of insns that need filling.  */
  if (unfilled_firstobj == 0)
    {
      gcc_obstack_init (&unfilled_slots_obstack);
      unfilled_firstobj = (rtx *) obstack_alloc (&unfilled_slots_obstack, 0);
    }

  for (insn = next_active_insn (first); insn; insn = next_active_insn (insn))
    {
      rtx target;

      INSN_ANNULLED_BRANCH_P (insn) = 0;
      INSN_FROM_TARGET_P (insn) = 0;

      /* Skip vector tables.  We can't get attributes for them.  */
      if (GET_CODE (insn) == JUMP_INSN
	  && (GET_CODE (PATTERN (insn)) == ADDR_VEC
	      || GET_CODE (PATTERN (insn)) == ADDR_DIFF_VEC))
	continue;

      if (num_delay_slots (insn) > 0)
	obstack_ptr_grow (&unfilled_slots_obstack, insn);

      /* Ensure all jumps go to the last of a set of consecutive labels.  */
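      /* For example (made-up labels), given the consecutive labels

	 L1:
	 L2:
	 L3:	insn

	 a conditional jump to L1 or L2 is redirected to L3, so every jump
	 to this insn names the same label and later passes can compare
	 jump targets directly.  */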
      if (GET_CODE (insn) == JUMP_INSN
	  && (condjump_p (insn) || condjump_in_parallel_p (insn))
	  && JUMP_LABEL (insn) != 0
	  && ((target = prev_label (next_active_insn (JUMP_LABEL (insn))))
	      != JUMP_LABEL (insn)))
	redirect_jump (insn, target);
    }

  /* Indicate what resources are required to be valid at the end of the
     current function.  The condition code never is and memory always is.
     If the frame pointer is needed, it is and so is the stack pointer
     unless EXIT_IGNORE_STACK is non-zero.  If the frame pointer is not
     needed, the stack pointer is.  Registers used to return the function
     value are needed.  Registers holding global variables are needed.  */

  end_of_function_needs.cc = 0;
  end_of_function_needs.memory = 1;
  end_of_function_needs.unch_memory = 0;
  CLEAR_HARD_REG_SET (end_of_function_needs.regs);

  if (frame_pointer_needed)
    {
      SET_HARD_REG_BIT (end_of_function_needs.regs, FRAME_POINTER_REGNUM);
#if HARD_FRAME_POINTER_REGNUM != FRAME_POINTER_REGNUM
      SET_HARD_REG_BIT (end_of_function_needs.regs, HARD_FRAME_POINTER_REGNUM);
#endif
#ifdef EXIT_IGNORE_STACK
      if (! EXIT_IGNORE_STACK)
#endif
	SET_HARD_REG_BIT (end_of_function_needs.regs, STACK_POINTER_REGNUM);
    }
  else
    SET_HARD_REG_BIT (end_of_function_needs.regs, STACK_POINTER_REGNUM);

  if (current_function_return_rtx != 0
      && GET_CODE (current_function_return_rtx) == REG)
    mark_referenced_resources (current_function_return_rtx,
			       &end_of_function_needs, 1);

  for (i = 0; i < FIRST_PSEUDO_REGISTER; i++)
    if (global_regs[i])
      SET_HARD_REG_BIT (end_of_function_needs.regs, i);

  /* The registers required to be live at the end of the function are
     represented in the flow information as being dead just prior to
     reaching the end of the function.  For example, the return of a value
     might be represented by a USE of the return register immediately
     followed by an unconditional jump to the return label where the
     return label is the end of the RTL chain.  The end of the RTL chain
     is then taken to mean that the return register is live.

     This sequence is no longer maintained when epilogue instructions are
     added to the RTL chain.  To reconstruct the original meaning, the
     start of the epilogue (NOTE_INSN_EPILOGUE_BEG) is regarded as the
     point where these registers become live (start_of_epilogue_needs).
     If epilogue instructions are present, the registers set by those
     instructions won't have been processed by flow.  Thus, those
     registers are additionally required at the end of the RTL chain
     (end_of_function_needs).  */
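  /* A concrete (hypothetical) example: for a function returning a value in
     hard register 0, register 0 is marked above and so appears in both
     resource sets; any registers that only the epilogue itself sets are
     added to end_of_function_needs alone by the loop below.  */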

  start_of_epilogue_needs = end_of_function_needs;

  while ((epilogue_insn = next_nonnote_insn (epilogue_insn)) != 0)
    mark_set_resources (epilogue_insn, &end_of_function_needs, 0, 1);

  /* Show we haven't computed an end-of-function label yet.  */
  end_of_function_label = 0;

  /* Allocate and initialize the tables used by mark_target_live_regs.  */
  target_hash_table
    = (struct target_info **) alloca ((TARGET_HASH_PRIME
				       * sizeof (struct target_info *)));
  bzero ((char *) target_hash_table,
	 TARGET_HASH_PRIME * sizeof (struct target_info *));

  bb_ticks = (int *) alloca (n_basic_blocks * sizeof (int));
  bzero ((char *) bb_ticks, n_basic_blocks * sizeof (int));

  /* Initialize the statistics for this function.  */
  bzero ((char *) num_insns_needing_delays, sizeof num_insns_needing_delays);
  bzero ((char *) num_filled_delays, sizeof num_filled_delays);

  /* Now do the delay slot filling.  Try everything twice in case earlier
     changes make more slots fillable.  */

  for (reorg_pass_number = 0;
       reorg_pass_number < MAX_REORG_PASSES;
       reorg_pass_number++)
    {
      fill_simple_delay_slots (first, 1);
      fill_simple_delay_slots (first, 0);
      fill_eager_delay_slots (first);
      relax_delay_slots (first);
    }

  /* Delete any USE insns made by update_block; subsequent passes don't need
     them or know how to deal with them.  */
  for (insn = first; insn; insn = next)
    {
      next = NEXT_INSN (insn);

      if (GET_CODE (insn) == INSN && GET_CODE (PATTERN (insn)) == USE
	  && GET_RTX_CLASS (GET_CODE (XEXP (PATTERN (insn), 0))) == 'i')
	next = delete_insn (insn);
    }

  /* If we made an end of function label, indicate that it is now
     safe to delete it by undoing our prior adjustment to LABEL_NUSES.
     If it is now unused, delete it.  */
  if (end_of_function_label && --LABEL_NUSES (end_of_function_label) == 0)
    delete_insn (end_of_function_label);

#ifdef HAVE_return
  if (HAVE_return && end_of_function_label != 0)
    make_return_insns (first);
#endif

  obstack_free (&unfilled_slots_obstack, unfilled_firstobj);

  /* It is not clear why the line below is needed, but it does seem to be.  */
  unfilled_firstobj = (rtx *) obstack_alloc (&unfilled_slots_obstack, 0);
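  /* When FILE is non-zero, the dump loop below emits per-pass statistics
     that look roughly like this (numbers invented for illustration):

	 ;; Reorg pass #1:
	 ;; Reorg function #0
	 ;; 12 insns needing delay slots
	 ;; 2 got 0 delays, 9 got 1 delays, 1 got 2 delays  */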
  /* Reposition the prologue and epilogue notes in case we moved the
     prologue/epilogue insns.  */
  reposition_prologue_and_epilogue_notes (first);

  if (file)
    {
      register int i, j, need_comma;

      for (reorg_pass_number = 0;
	   reorg_pass_number < MAX_REORG_PASSES;
	   reorg_pass_number++)
	{
	  fprintf (file, ";; Reorg pass #%d:\n", reorg_pass_number + 1);
	  for (i = 0; i < NUM_REORG_FUNCTIONS; i++)
	    {
	      need_comma = 0;
	      fprintf (file, ";; Reorg function #%d\n", i);

	      fprintf (file, ";; %d insns needing delay slots\n;; ",
		       num_insns_needing_delays[i][reorg_pass_number]);

	      for (j = 0; j < MAX_DELAY_HISTOGRAM; j++)
		if (num_filled_delays[i][j][reorg_pass_number])
		  {
		    if (need_comma)
		      fprintf (file, ", ");
		    need_comma = 1;
		    fprintf (file, "%d got %d delays",
			     num_filled_delays[i][j][reorg_pass_number], j);
		  }
	      fprintf (file, "\n");
	    }
	}
    }
}
#endif /* DELAY_SLOTS */