Using a stack
As can be observed in Knuth-Bendix completion - an example, after we had added rule 6 there was no point in considering rule 5, since it had been rendered redundant. Skipping such redundant rules can be achieved by maintaining the reducedness property of the rewriting system throughout the completion.
KnuthBendix.KBStack — Type

KBStack <: KBS2Alg <: CompletionAlgorithm

The Knuth-Bendix completion algorithm that (if successful) yields the reduced, confluent rewriting system generated by the rules of rws.

KBStack uses a stack of new rules: with each addition of a new rule to the rewriting system, all potentially redundant rules are moved onto the stack. This way the rewriting system is always reduced, and only the non-redundant rules are considered when searching for critical pairs.

This implementation closely follows the KBS_2 procedure as described in Section 2.6 of [Sims1994], p. 76.

Forced termination takes place after the number of active rules stored in the RewritingSystem reaches max_rules.
You can run it on a rws by calling

knuthbendix(KnuthBendix.Settings(KnuthBendix.KBStack()), rws)

By default the algorithm terminates early after 500 active rewriting rules have been found. To control this behaviour the Settings struct has many knobs and whistles.
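For example, the threshold of forced termination could be raised by passing a larger max_rules when constructing Settings. The sketch below is a minimal illustration only: it assumes that rws is an already constructed RewritingSystem and that Settings accepts a max_rules keyword, as suggested by the docstring above; the value 2_000 is arbitrary.

# minimal sketch: `rws` is assumed to already exist;
# `max_rules` is assumed to be a Settings keyword controlling forced termination
settings = KnuthBendix.Settings(KnuthBendix.KBStack(); max_rules = 2_000)
rws_confluent = knuthbendix(settings, rws)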
While KBStack vastly outperforms the naive KBPlain, it is still rather slow. E.g. the eighth quotient of the (2,3,7) triangle group
\[\langle a,b \mid a^2 = b^3 = (ab)^7 = \left(a^{-1}b^{-1}ab\right)^8 = 1 \rangle,\]
which has a confluent rewriting system with 1023 rules, is still too large to be completed successfully.
There are two main performance problems here:
- the poor performance of the naive rewriting still used by KBStack: the complexity of this rewriting depends on the overall size of the rewriting system;
- the fact that find_critical_pairs! and deriverule! assume and maintain the reducedness of the rewriting system makes their complexity quadratic in the size of the rewriting system, so they become the bottleneck for larger rewriting systems.
To address both problems we will use the theory of automata and regular languages in KBIndex.

Below is Julia-flavoured pseudocode describing the completion procedure.
function knuthbendix2(rws::RewritingSystem{W}, ...) where W
    stack = Vector{Tuple{W,W}}()
    rws = reduce(rws) # this is assumed by forceconfluence!
    # ... initialize some temporary structures here
    for ri in rules(rws)
        for rj in rules(rws)
            isactive(ri) || break
            forceconfluence!(rws, stack, ri, rj, ...)
            ri === rj && break
            isactive(ri) || break
            isactive(rj) || continue
            forceconfluence!(rws, stack, rj, ri, ...)
        end
    end
    remove_inactive!(rws) # all active rules form a reduced confluent rws
    return rws
end
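Both orders of each pair of rules are examined because suffix-prefix overlaps are not symmetric: the overlaps of a suffix of lhs(ri) with a prefix of lhs(rj) are in general different from those of a suffix of lhs(rj) with a prefix of lhs(ri), hence the two calls to forceconfluence!.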
Not much has changed compared to KBPlain on the surface, but there are more substantial changes under the hood. In particular forceconfluence! has become simply
function forceconfluence!(rws::RewritingSystem, stack, ri, rj, ...)
    find_critical_pairs!(stack, rws, ri, rj, ...)
    deriverule!(rws, stack, ...)
    return rws
end
I.e. we first push all failures to local confluence derived from ri and rj onto stack (find_critical_pairs!), then we empty the stack (deriverule!). To do this the top pair lhs → rhs is picked from the stack, we mark all the rules in rws that could be reduced with lhs as inactive, and we push them onto the stack. Only afterwards do we push lhs → rhs to rws, and we repeat until the stack is empty. More concretely, in Julia-flavoured pseudocode,
while !isempty(stack)
    (a, b) = pop!(stack)
    A, B = rewrite(a, rws), rewrite(b, rws)
    if A ≠ B # a genuine failure to local confluence
        new_rule = Rule(A, B, ordering(rws))
        for rule in rules(rws)
            if isreducible(rule, new_rule)
                deactivate!(rule) # the rule became redundant
                push!(stack, rule) # re-examine it later as a (potentially) critical pair
            end
        end
        push!(rws, new_rule)
    end
end
Note that in KBStack we can break (or continue) the inner loop based on the inactivity of rules, as specified in the listing above. Since in the process of completion the rule ri currently being processed could have been marked as inactive (e.g. it became redundant), it is advisable that those checks are performed after every call to forceconfluence!.
Internal functions
KnuthBendix.forceconfluence! — Method

forceconfluence!(rws::RewritingSystem, stack, r₁, r₂[, work = Workspace(rws)])

Examine overlaps of left-hand sides of rules r₁ and r₂ to find (potential) failures to local confluence. New rules are added to assure local confluence if necessary.

This version assumes the reducedness of rws, so that failures to local confluence are of the form a·b·c with all of a, b, c nontrivial, lhs₁ = a·b and lhs₂ = b·c.

This version uses stack to maintain the reducedness of rws and work::Workspace to save allocations in the rewriting.

See procedure OVERLAP_2 in [Sims, p. 77].
KnuthBendix.find_critical_pairs! — Method

find_critical_pairs!(stack, rws, r₁::Rule, r₂::Rule[, work=Workspace(rws)])

Find critical pairs derived from suffix-prefix overlaps of the left-hand sides of r₁ and r₂.

Such failures to local confluence arise from words W = ABC, where AB and BC are the left-hand sides of rules r₁ and r₂, respectively. It is assumed that all of A, B and C are nonempty.
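To make this concrete, here is a small hypothetical illustration (not an example taken from the package). Suppose r₁ is x·y → y and r₂ is y·x → x. Taking A = x, B = y, C = x gives W = x·y·x, which rewrites to y·x using r₁ and to x·x using r₂, so the critical pair (y·x, x·x) is pushed onto the stack. When the pair is later processed, both sides are rewritten (y·x → x) and, since x ≠ x·x, a new rule resolving the pair is added.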
KnuthBendix.deriverule! — Method

deriverule!(rws::RewritingSystem, stack[, work = Workspace(rws)])

Empty the stack of (potentially) critical pairs by deriving and adding to rws new rules resolving the pairs, i.e. maintain the local confluence of rws.

This function may deactivate rules in rws if they are deemed redundant (e.g. they follow from the newly added rules). See [Sims, p. 76].
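The while loop in the pseudocode earlier in this section sketches exactly this procedure.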
KnuthBendix.reduce! — Method

reduce!(::KBS2Alg, rws::RewritingSystem[, work=Workspace(rws); sort_rules=true])

Bring rws to its reduced form using the stack-based algorithm. For details see reduce!.

reduce!(::KBS2Alg, rws::RewritingSystem, stack, ...)

Append rules from stack to rws, maintaining reducedness.

Assuming that rws is reduced, merge the stack of rules into rws using deriverule!.
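A minimal usage sketch (an illustration only, assuming rws is an already constructed RewritingSystem; since KBS2Alg is abstract, a concrete KBStack instance is passed for dispatch, and the call is assumed to return rws in its reduced form):

# sketch: pass a concrete algorithm instance for the ::KBS2Alg argument
reduced_rws = KnuthBendix.reduce!(KnuthBendix.KBStack(), rws)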
- [Sims1994] Charles C. Sims, Computation with Finitely Presented Groups, Cambridge University Press, 1994.