The bit-flip learning algorithm (Wuensche 1992a) changes particular bits in rule tables so that the target state becomes the actual successor of each aspiring pre-image, instead of its successor under the original network parameters.
The procedure must succeed. Pre-existing (or just learnt) pre-images cannot be ``forgotten'', i.e. detached as pre-images of the target state, by learning more states as pre-images. However, other states not on the list of aspiring pre-images may also be learnt, and side effects will occur elsewhere in the attractor basin.
To ``forget'' a pre-image, just one bit-flip from a set of sufficient alternatives is required; this is chosen from the set at random. As fewer network changes are needed, there will be fewer side effects.
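The learning and forgetting steps described above can be sketched as follows, assuming a simple random Boolean network representation (per-cell wiring lists and rule tables held as bit lists) rather than DDLab's actual data structures; all function and variable names here are illustrative:

```python
import random

def neighborhood_index(state, wiring_i):
    """Pack the values of cell i's inputs into a rule-table index."""
    idx = 0
    for w in wiring_i:
        idx = (idx << 1) | state[w]
    return idx

def successor(state, wiring, rules):
    """Synchronous update: each cell reads its own rule table."""
    return [rules[i][neighborhood_index(state, wiring[i])]
            for i in range(len(state))]

def learn(pre_image, target, wiring, rules):
    """Flip exactly the rule-table bits needed so that `target`
    becomes the actual successor of `pre_image` -- the procedure
    must succeed, since each cell's output bit is set directly."""
    for i in range(len(target)):
        idx = neighborhood_index(pre_image, wiring[i])
        if rules[i][idx] != target[i]:
            rules[i][idx] ^= 1   # bit-flip in cell i's rule table

def forget(pre_image, target, wiring, rules, rng=random):
    """Detach `pre_image` as a pre-image of `target` with a single
    bit-flip, chosen at random from the set of sufficient flips
    (one per cell): cell i's output now disagrees with target."""
    i = rng.randrange(len(target))
    idx = neighborhood_index(pre_image, wiring[i])
    rules[i][idx] ^= 1
```

Note that each flipped rule-table entry also changes the successor of every other state presenting the same neighborhood pattern to that cell, which is the source of the side effects mentioned above.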
If ``bit-flips'' is selected at #27.9, the required changes to the network will rapidly be made, and the following prompt will appear,
more-m
Enter ``m'' to repeat the procedure from #27.5.