
Word equation

Relation in Theoretical Computer Science

A word equation is a formal equality $E \colon u \doteq v$ between a pair of words $u$ and $v$, each over an alphabet $\Sigma \cup \Xi$ comprising both constants (drawn from $\Sigma$) and unknowns (drawn from $\Xi$). An assignment $h$ of constant words to the unknowns of $E$ is said to solve $E$ if it maps both sides of $E$ to identical words. In other words, the solutions of $E$ are those morphisms $h \colon (\Sigma \cup \Xi)^{*} \to \Sigma^{*}$ whose restriction to $\Sigma$ is the identity map and which satisfy $h(u) = h(v)$. Word equations are a central object in combinatorics on words; they play a role in that area analogous to the one Diophantine equations play in number theory. One stark difference is that the solubility problem for Diophantine equations is undecidable, whereas the analogous problem for word equations is decidable.
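The definition of a solution can be made concrete with a short sketch. Here, as a purely illustrative convention (not from the article), lowercase letters denote constants and uppercase letters denote unknowns; an assignment is a dictionary mapping unknowns to constant words.

```python
# A word equation u ≐ v is solved by an assignment h of constant words to the
# unknowns iff substituting h into both sides yields the same word.
# Convention (illustrative only): lowercase = constants, uppercase = unknowns.

def substitute(side: str, h: dict) -> str:
    """Apply the assignment h to every unknown occurring in one side."""
    return "".join(h.get(symbol, symbol) for symbol in side)

def solves(u: str, v: str, h: dict) -> bool:
    """Check whether h maps both sides of u ≐ v to identical words."""
    return substitute(u, h) == substitute(v, h)

# The commutation equation Xw ≐ wX with w = "ab": X ↦ "abab" = w² solves it.
print(solves("Xab", "abX", {"X": "abab"}))   # True
print(solves("Xab", "abX", {"X": "ba"}))     # False
```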

A classical example of a word equation is the commutation equation $xw \doteq wx$, in which $x$ is an unknown and $w$ is a constant word. It is well known that the solutions of the commutation equation are exactly those morphisms $h$ mapping $x$ to some power of $w$. Another example is the conjugacy equation $xz \doteq zy$, in which $x$, $y$, and $z$ are all unknowns. The solutions of this equation are precisely those morphisms $h$ sending $x$ and $y$ to conjugate words, with the image $h(z)$ filled in as appropriate.
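The characterisation of the commutation equation's solutions can be checked by brute force over short words. The sketch below fixes the primitive constant word $w = ab$ and verifies, for every candidate image of $x$ up to length 6, that solving $xw \doteq wx$ coincides with being a power of $w$.

```python
# Brute-force check that, for the primitive word w = "ab", the solutions of the
# commutation equation xw ≐ wx are exactly the images x ↦ w^k (k ≥ 0).
from itertools import product

w = "ab"
for n in range(0, 7):
    for tup in product("ab", repeat=n):
        x = "".join(tup)
        is_solution = (x + w == w + x)
        is_power_of_w = (len(x) % len(w) == 0 and x == w * (len(x) // len(w)))
        assert is_solution == is_power_of_w
print("verified for all words over {a, b} of length <= 6")
```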

Many subclasses of word equations have been introduced, some of which include:

  • constant-free equations, which are those $u \doteq v$ in which $u$ and $v$ comprise unknowns only. Such equations have a trivial solution wherein all their unknowns are erased; as such, they are usually studied over free semigroups.
  • quadratic equations, which are those containing each of their unknowns at most twice. This is exactly the class of word equations on which the Nielsen transformations algorithm (cf. below) terminates.
  • word equations in one unknown, whose solubility can be checked in linear time.

History

The study of word equations was initiated by Willard Quine as early as 1946. Quine proved that the first-order theory of word equations is essentially equivalent to the first-order theory of arithmetic. In 1954, Andrey Markov coined the term "word equation" and introduced the solubility problem for them: decide whether a given word equation admits a solution. For a long time, it was hoped that this problem would turn out to be undecidable. One reason for this is that word equations were expected (incorrectly, as it turns out) to provide an intermediate step between Hilbert's tenth problem and the undecidable problems relating to Turing machines.

Gennady Makanin, who introduced Makanin's algorithm.

Further contributions were made in the early 1970s with the work of André Lentin and Juri Ilich Hmelevskii. In 1976, Gennady Makanin introduced a method by which it can be determined whether any given word equation admits a solution. The proof that this procedure, which has come to be known as Makanin's algorithm, exists and terminates is very difficult, and it is one of the most celebrated results in combinatorics on words. Makanin's algorithm is considered one of the most conceptually difficult algorithms in the literature, and it is also highly intractable, requiring (in its initial formulation) triply exponential time. There were therefore many attempts to improve upon it. In 1999, Wojciech Plandowski introduced a novel algorithm, showing that the solubility problem for word equations is in PSPACE.

In 2006, Plandowski and Wojciech Rytter showed that minimal solutions of word equations are highly (i.e., exponentially) compressible using Lempel-Ziv encoding. It is conjectured that the length of a minimal solution of a word equation $E$ is at most singly exponential in the length $|E|$ of $E$. If this conjecture is true, then Plandowski and Rytter's result yields a straightforward "guess-and-verify" NP algorithm for the solubility problem: they show that a solution $h$ can be verified while working only with its LZ-compressed representation $LZ(h)$, and the truth of the conjecture would imply that $LZ(h)$ has size polynomial in $|E|$. As it stands, the last part of the complexity analysis, namely the question of whether solving word equations is NP-complete, remains open. (NP-hardness follows immediately from the fact that solving word equations generalises the NP-complete problem of pattern matching.)

Methods of solution

There is no "elementary" algorithm for determining whether a given word equation admits a solution. The algorithms mentioned above are all of theoretical interest, but they are of little help in solving a word equation by hand, for instance. There exist, however, a few methods that can sometimes help with this:

Length arguments

Because a solution $h$ to a word equation $E$ must unify its two sides, one can use the multiset of symbols occurring on either side of $E$ to deduce a linear equality in the lengths of the images of the unknowns. For instance, the form of $xx \doteq y$ implies that its solutions $h$ must satisfy $|h(y)| = 2|h(x)|$, which narrows down the set of possible $h$ to check. Similar arguments can allow a word equation $E$ to be "split up" into smaller ones, if positions within the two sides of $E$ can be deduced which must line up in all solutions of $E$. For instance, the midpoints of each side of $xyyx \doteq uvvu$ can be detected via a length argument, and hence that word equation can be split into the system $\{xy \doteq uv,\; yx \doteq vu\}$.
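The first step of such a length argument is mechanical: counting how often each symbol occurs on the two sides yields the linear constraint directly. A minimal sketch, again using the illustrative uppercase-unknowns convention:

```python
# A length argument in code: the difference of symbol counts between the two
# sides of u ≐ v gives a linear constraint  sum(coeff * |h(X)|) + offset = 0
# that every solution h must satisfy. Uppercase = unknowns (illustrative).
from collections import Counter

def length_constraint(u: str, v: str):
    """Return ({unknown: coefficient}, constant_offset) for the equation u ≐ v."""
    counts = Counter(u)
    counts.subtract(Counter(v))
    coeffs = {s: c for s, c in counts.items() if s.isupper() and c != 0}
    offset = sum(c for s, c in counts.items() if not s.isupper())
    return coeffs, offset

# For XX ≐ Y this recovers 2|h(X)| - |h(Y)| = 0, i.e. |h(Y)| = 2|h(X)|.
print(length_constraint("XX", "Y"))   # ({'X': 2, 'Y': -1}, 0)
```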

Appeal to periodicity

Another useful tool for reasoning about word equations is the periodicity lemma of Fine and Wilf, which describes what happens if a certain word has multiple periods (i.e., distances at which its letters repeat). Consider, for instance, the word equation $E \colon x^2 y^2 \doteq z^2$. Suppose that $(x, y, z) \mapsto (w_x, w_y, w_z)$ is one of its solutions. Then $w_x^2 w_y^2 = w_z^2$. By taking a suitable conjugacy in this identity, one can infer that there exists some conjugate $w_z'$ of $w_z$ such that $(w_z')^2 = w_x w_y^2 w_x$. Now a length argument permits the midpoint of each side to be identified, and it follows from this observation that $w_x w_y = w_z' = w_y w_x$. Herein is the commutation equation, whence $w_x$ and $w_y$ are powers of a common word. Now the infinite words $w_x^{\omega}$ and $w_z^{\omega}$ have a common prefix of length $|w_z^2|$. Since $|w_z^2| = 2|w_z| \geq |w_z| + |w_x| \geq |w_z| + |w_x| - \gcd(|w_z|, |w_x|)$, the periodicity lemma can be applied. Its conclusion here is that $w_z$ and $w_x$ are powers of a common word too. Thus, every solution $h$ of $E$ maps $x$, $y$, and $z$ to powers of a common word.
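The periodicity lemma itself is easy to state in code and to sanity-check exhaustively on short words. The sketch below verifies, for every binary word of length at most 12, that two periods $p$ and $q$ with $|w| \geq p + q - \gcd(p, q)$ force the period $\gcd(p, q)$:

```python
# Fine-Wilf periodicity lemma, checked by brute force over a binary alphabet:
# if a word of length >= p + q - gcd(p, q) has periods p and q, it also has
# period gcd(p, q).
from itertools import product
from math import gcd

def has_period(w: str, p: int) -> bool:
    """A word has period p when letters p positions apart always agree."""
    return all(w[i] == w[i + p] for i in range(len(w) - p))

for n in range(1, 13):
    for tup in product("ab", repeat=n):
        w = "".join(tup)
        for p in range(1, n + 1):
            for q in range(1, n + 1):
                if n >= p + q - gcd(p, q) and has_period(w, p) and has_period(w, q):
                    assert has_period(w, gcd(p, q))
print("Fine-Wilf verified for all binary words of length <= 12")
```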

Nielsen transformations

The workings of the Nielsen transformations algorithm on the word equation $XaY \doteq YbX$. The trivial word equation was not reached, implying the equation's insolubility.

Let $E \colon xu \doteq yv$ be a word equation, where $x, y \in \Sigma \cup \Xi$ and $u, v \in (\Sigma \cup \Xi)^{*}$. Presented here is a conceptually simple method (called the Nielsen transformations algorithm, or Levi's method) to determine whether $E$ is soluble, with the caveat that the method is guaranteed to terminate only on quadratic word equations (as defined above).

The idea of the algorithm is to "guess" how the lengths of the images of $x$ and $y$ compare in some solution $h$ of $E$: either $|h(x)| > |h(y)|$, $|h(x)| < |h(y)|$, or $h(x) = h(y)$. In the first case, one can apply the string-rewriting rule $x \mapsto yx$ to $E$, where $x$ (after the rewriting) is a new quantity whose meaning is "what is left of the old $x$ once $y$ is removed". Symmetrically, in the second case one can apply the rule $y \mapsto xy$, and in the third case $x \mapsto y$. The present method actually makes all three guesses; for each of them (separately) it rewrites $E$ to account for the guess having been made.

By construction, there will be some cancellation at the start of $E$ after applying each string-rewriting rule. (For instance, applying $x \mapsto ax$ to the equation $xab \doteq abx$ yields $axab \doteq abax$, which cancels down to $xab \doteq bax$.) The method always takes advantage of this cancellation; the hope is that it is enough to counteract the string-rewriting rule, which (in general) will have made the equation longer.

The algorithm thus amounts to exhaustively applying these transformations. It is natural to view the workings of the algorithm as the construction of a graph $\mathcal{G}(E)$, whose nodes are the reached equations and whose edges are the transformations between them. If the trivial word equation $\varepsilon \doteq \varepsilon$ (where $\varepsilon$ is the empty word) is ever encountered during this construction, then $E$ is surely soluble. Conversely, if $E$ is soluble, then $\varepsilon \doteq \varepsilon$ must appear in $\mathcal{G}(E)$. So, by this method (assuming $\mathcal{G}(E)$ is finite), it can be determined whether $E$ admits a solution.
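The search described above can be sketched as a breadth-first exploration of $\mathcal{G}(E)$. This is a simplified variant, not the algorithm as stated in the literature: it works over the free monoid and therefore adds explicit "the unknown maps to the empty word" guesses alongside the rewriting rules, and it again uses the illustrative lowercase-constants/uppercase-unknowns convention. On quadratic equations the reachable equations never grow longer than the input, so the search terminates.

```python
# A sketch of the Nielsen transformations (Levi's method) search.
# Convention (illustrative only): lowercase = constants, uppercase = unknowns.
from collections import deque

def is_unknown(s: str) -> bool:
    return s.isupper()

def simplify(u: str, v: str):
    """Cancel the longest common prefix; return None on a constant clash,
    which shows this branch admits no solution."""
    while u and v and u[0] == v[0]:
        u, v = u[1:], v[1:]
    if u and v and not is_unknown(u[0]) and not is_unknown(v[0]):
        return None  # two distinct leading constants can never be unified
    return u, v

def soluble(u: str, v: str) -> bool:
    """Decide solubility of u ≐ v (terminates when the equation is quadratic)."""
    seen = set()
    queue = deque([(u, v)])
    while queue:
        eq = simplify(*queue.popleft())
        if eq is None or eq in seen:
            continue
        u, v = eq
        if u == "" and v == "":
            return True  # reached the trivial equation ε ≐ ε
        seen.add(eq)
        if not u or not v:
            # One side is empty: the other must be erasable, i.e. all unknowns.
            if all(is_unknown(s) for s in (u or v)):
                return True
            continue
        a, b = u[0], v[0]
        for x, y in ((a, b), (b, a)):
            if is_unknown(x):
                # Guess h(x) = ε ...
                queue.append((u.replace(x, ""), v.replace(x, "")))
                # ... or h(x) starts with the image of y: rewrite x ↦ yx.
                queue.append((u.replace(x, y + x), v.replace(x, y + x)))
    return False

print(soluble("Xab", "abX"))  # True: e.g. X ↦ ε or X ↦ "ab"
print(soluble("XaY", "YbX"))  # False: the insoluble equation from the figure
```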

Systems of word equations

One can define systems of word equations in the natural way. A solution of such a system $S$ is a morphism $h$ that simultaneously solves every equation in $S$. A natural extension is to consider Boolean formulas of word equations, in which negation and disjunction are also allowed. In fact, every system (and even every Boolean formula) of word equations is equivalent to a single word equation. Thus, many results on word equations generalise immediately to such systems (resp. formulas). It must be said, however, that the transformation into a single word equation can introduce extra unknowns, and this is sometimes necessary.

Two word equations (or systems thereof) are called equivalent if they have the same set of solutions. A system of word equations is called independent if it is not equivalent to any of its proper subsystems. Put another way, an independent system $S$ of word equations is one such that every $E \in S$ can be solved "independently", i.e., without solving any of the other $E' \in S$. An interesting compactness theorem, usually bearing the name of Andrzej Ehrenfeucht, states that an infinite system of word equations with a finite number of unknowns is necessarily equivalent to one of its finite subsystems. It follows that any independent system of word equations with a finite number of unknowns is itself finite.

Expressing formal languages and relations

Word equations can be used to characterise properties of (tuples of) words. For instance, a word ends in $a$ if and only if it is the image $h(x)$ in some solution of the word equation $x \doteq ya$. Similarly, two words commute if and only if they are the images $h(x), h(y)$ in some solution of the word equation $xy \doteq yx$. In this sense, word equations can be thought of as mechanisms for expressing formal languages, in analogy with automata and formal grammars. It is not known exactly which properties of (tuples of) words are expressible via word equations in this way. In particular, showing that a relation is inexpressible by word equations is often quite challenging. (An example of an inexpressible property is "$x$ is primitive".)
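This expressibility view can be illustrated with a small brute-force solution enumerator (a toy, not a practical solver): collecting the images $h(x)$ over all solutions of $x \doteq ya$ yields precisely the words ending in $a$, up to the search bound. The uppercase-unknowns convention is again an illustrative assumption.

```python
# Enumerate solutions of u ≐ v by trying every assignment of short words to
# the unknowns (uppercase letters, illustrative convention).
from itertools import product

def solutions(u: str, v: str, alphabet: str = "ab", bound: int = 3):
    """Yield every assignment h (dict) with images of length <= bound
    that solves u ≐ v."""
    unknowns = sorted({s for s in u + v if s.isupper()})
    words = ["".join(t) for n in range(bound + 1)
             for t in product(alphabet, repeat=n)]
    for images in product(words, repeat=len(unknowns)):
        h = dict(zip(unknowns, images))
        sub = lambda side: "".join(h.get(s, s) for s in side)
        if sub(u) == sub(v):
            yield h

# The language expressed by X in X ≐ Ya: exactly the words ending in "a".
lang = {h["X"] for h in solutions("X", "Ya")}
assert all(w.endswith("a") for w in lang)
print(sorted(lang, key=len))
```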

It should also be noted that even characterising the solution set of a single word equation is complicated. Hmelevskii proved that, although the solutions to three-unknown constant-free equations can be given in terms of finite expressions with word and integer parameters, this is not true (in general) for four-unknown constant-free equations. In fact, $xyz \doteq ztx$ is an example of such a "non-parametrisable" word equation.

Extended theories and connections to string solving

One can augment word equations with other types of constraints on the values $h(x)$, $x \in \Xi$. For instance, in 1968, Yuri Matiyasevich considered an extension of word equations by "length constraints" as a possible tool for showing the unsolvability of Hilbert's tenth problem. These length constraints amounted to linear inequalities in the lengths $|h(x)|$, $x \in \Xi$. Sometimes, allowing extra constraints alongside word equations leads to theories with undecidable solubility problems, but it is also possible to add less powerful constraints and end up with a theory that is still decidable. An example of the former type of constraint is requiring that some $h(x), h(y)$ be Abelian equivalent (i.e., anagrams of one another); an example of the latter type is requiring that some $h(x)$ belong to a given regular language $L$. For Matiyasevich's extension with length constraints, the decidability of the solubility problem remains open.

There has been recent interest in the theory of word equations (and more general theories based on it) from the practical point of view of those developing software-verification tools called string solvers. These increasingly popular tools seek to solve constraint satisfaction problems about strings algorithmically. Such problems take the form of a set of constraints which some unknown strings must satisfy; the string solver should then determine whether strings exist which satisfy all the given constraints. A typical goal of such a tool would be to guarantee that a particular piece of software is free from some string-related vulnerability, such as cross-site scripting or code injection.

The building blocks of the constraints used in these tools are the standard questions one might ask of strings, such as "is $x$ a substring of $y$?", "what is the length of string $x$?", and "what is the index of string $x$ in string $y$?". Ostensibly, these constraints can be modelled by theories based on word equations, and as such, string solver tools must be capable of dealing with these theories algorithmically (at least in the subcase of those equations and formulas that actually arise in practice).

Relation to the defect effect

The defect theorem is a central result in combinatorics on words. It says that, if a set $X$ of words satisfies a nontrivial relation, then the words of $X$ can be (simultaneously) expressed as powers of $|X| - r$ words, where $r \geq 1$. Such a set $X$ is then said to possess a "defect effect" of order $r$. Systems of word equations (at least "nontrivial" ones) express the fact that a certain finite set of words (namely the images $h(x)$ of the unknowns $x \in \Xi$) satisfies some nontrivial relation(s). So it can be said that systems $S$ of word equations cause a defect effect in the sets of words $X = \{h(x) : x \in \Xi\}$ coming from solutions $h$ of $S$.
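The simplest instance of the defect theorem is the two-word case: if $u$ and $v$ satisfy the nontrivial relation $uv = vu$, then both are powers of a single word, their common primitive root. A minimal sketch of this special case:

```python
# Defect effect of order 1 for a two-element set {u, v}: the commutation
# relation uv = vu forces u and v to be powers of one common word.

def primitive_root(w: str) -> str:
    """Shortest word p with w = p^k, found by trying each candidate period."""
    n = len(w)
    for d in range(1, n + 1):
        if n % d == 0 and w == w[:d] * (n // d):
            return w[:d]
    return w

u, v = "abab", "abababab"
assert u + v == v + u                               # the nontrivial relation
assert primitive_root(u) == primitive_root(v) == "ab"
# The 2-element set {u, v} is expressible with 2 - 1 = 1 word:
print(primitive_root(u))  # "ab"
```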

The defect effect caused by certain systems of word equations has been studied, and there exist some surprising results in this vein, showing that the "dimensionality properties" of sets of words are actually quite weak. For instance, it is known that there exists an independent system $S$ of equations of size $\Omega(n^2)$, containing $n$ unknowns, which causes only a defect effect of order $1$.

Role within abstract algebra

There has been much research into formulating and solving equations within different structures of abstract algebra (e.g., groups and semigroups). Word equations, as presented here, are simply equations in free monoids. Equations in free semigroups are closely related to these; in fact, they are just word equations with the additional requirement that the solution morphism $h$ be nonerasing. One can also consider equations in free groups, although the theory of such objects differs in many ways from the discussion presented here. Another result of Makanin states that the solubility problem for equations in free groups is again decidable.

References

  1. Lothaire, M., ed. (1997). Combinatorics on Words. Cambridge Mathematical Library (2nd ed.). Cambridge: Cambridge University Press. doi:10.1017/cbo9780511566097. ISBN 978-0-521-59924-5.
  2. Cooper, S. Barry (2017). Computability Theory. doi:10.1201/9781315275789. ISBN 978-1-315-27578-9.
  3. Makanin, G. S. (1977). "The Problem of Solvability of Equations in a Free Semigroup". Mathematics of the USSR-Sbornik. 32 (2): 129–198. doi:10.1070/sm1977v032n02abeh002376. ISSN 0025-5734.
  4. Rozenberg, Grzegorz; Salomaa, Arto, eds. (1997). Handbook of Formal Languages. doi:10.1007/978-3-642-59136-5. ISBN 978-3-642-63863-3.
  5. Lothaire, M. (2002). Algebraic Combinatorics on Words. Encyclopedia of Mathematics and its Applications. Cambridge: Cambridge University Press. doi:10.1017/cbo9781107326019. ISBN 978-0-521-81220-7.
  6. Berstel, Jean; Karhumäki, Juhani (2003). "Combinatorics on Words – A Tutorial" (PDF). Retrieved 18 October 2024.
  7. Jeż, Artur (2016). "One-Variable Word Equations in Linear Time". Algorithmica. 74 (1): 1–48. arXiv:1302.3481. doi:10.1007/s00453-014-9931-3. ISSN 1432-0541.
  8. Quine, W. V. (1946). "Concatenation as a basis for arithmetic". The Journal of Symbolic Logic. 11 (4): 105–114. doi:10.2307/2268308. ISSN 0022-4812. JSTOR 2268308.
  9. Chmelevskij, Jurij I. (1976). Equations in Free Semigroups. Proceedings of the Steklov Institute of Mathematics. Providence, RI: American Mathematical Society. ISBN 978-0-8218-3007-9.
  10. Markov, A. A.; Nagornyĭ, N. M. (1988). The Theory of Algorithms. Mathematics and Its Applications (Soviet Series) (in English and Russian). Dordrecht; Boston: Kluwer Academic. ISBN 978-90-277-2773-2.
  11. Obono, S. Eyono; Goralcik, P.; Maksimenko, M. (1994). "Efficient solving of the word equations in one variable". In Prívara, Igor; Rovan, Branislav; Ruzička, Peter (eds.). Mathematical Foundations of Computer Science 1994. Lecture Notes in Computer Science. Vol. 841. Berlin, Heidelberg: Springer. pp. 336–341. doi:10.1007/3-540-58338-6_80. ISBN 978-3-540-48663-3.
  12. Berstel, Jean; Perrin, Dominique (2007). "The origins of combinatorics on words". European Journal of Combinatorics. 28 (3): 996–1022. doi:10.1016/j.ejc.2005.07.019. ISSN 0195-6698.
  13. Kościelski, Antoni; Pacholski, Leszek (1996). "Complexity of Makanin's algorithm". Journal of the ACM. 43 (4): 670–684. doi:10.1145/234533.234543. ISSN 0004-5411.
  14. Plandowski, W. (1999). "Satisfiability of word equations with constants is in PSPACE". 40th Annual Symposium on Foundations of Computer Science. IEEE Computer Society. pp. 495–500. doi:10.1109/SFFCS.1999.814622. ISBN 978-0-7695-0409-4.
  15. Plandowski, Wojciech; Rytter, Wojciech (1998). "Application of Lempel-Ziv encodings to the solution of word equations". In Larsen, Kim G.; Skyum, Sven; Winskel, Glynn (eds.). Automata, Languages and Programming. Lecture Notes in Computer Science. Vol. 1443. Berlin, Heidelberg: Springer. pp. 731–742. doi:10.1007/BFb0055097. ISBN 978-3-540-68681-1.
  16. Karhumäki, Juhani. "Combinatorics of Words" (PDF). Retrieved 18 October 2024.
  17. Lin, Anthony W.; Majumdar, Rupak (2021). "Quadratic Word Equations with Length Constraints, Counter Systems, and Presburger Arithmetic with Divisibility". Logical Methods in Computer Science. 17 (4). arXiv:2007.15478. doi:10.46298/lmcs-17(4:4)2021. ISSN 1860-5974.
  18. Albert, M. H.; Lawrence, J. (1985). "A proof of Ehrenfeucht's Conjecture". Theoretical Computer Science. 41: 121–123. doi:10.1016/0304-3975(85)90066-0. ISSN 0304-3975.
  19. Karhumäki, Juhani; Mignosi, Filippo; Plandowski, Wojciech (2000). "The expressibility of languages and relations by word equations". Journal of the ACM. 47 (3): 483–505. doi:10.1145/337244.337255. ISSN 0004-5411.
  20. Matiyasevich, Yuri (1968). "Svyaz' sistem uravneniy v slovakh i dlinakh s 10-y problemoy Gil'berta" [The connection between systems of equations in words and lengths and Hilbert's tenth problem] (PDF) (in Russian). Math-Net.Ru. Retrieved 19 October 2024.
  21. Day, Joel D.; Ganesh, Vijay; He, Paul; Manea, Florin; Nowotka, Dirk (2018). "The Satisfiability of Word Equations: Decidable and Undecidable Theories". In Potapov, Igor; Reynier, Pierre-Alain (eds.). Reachability Problems. Lecture Notes in Computer Science. Vol. 11123. Cham: Springer International Publishing. pp. 15–29. doi:10.1007/978-3-030-00250-3_2. ISBN 978-3-030-00250-3.
  22. Diekert, Volker; Gutierrez, Claudio; Hagenah, Christian (2005). "The existential theory of equations with rational constraints in free groups is PSPACE-complete". Information and Computation. 202 (2): 105–140. arXiv:cs/0103018. doi:10.1016/j.ic.2005.04.002. ISSN 0890-5401.
  23. Lin, Anthony W.; Barceló, Pablo (2016). "String solving with word equations and transducers: towards a logic for analysing mutation XSS". SIGPLAN Notices. 51 (1): 123–136. doi:10.1145/2914770.2837641. ISSN 0362-1340.
  24. Amadini, Roberto (2021). "A Survey on String Constraint Solving". ACM Computing Surveys. 55 (1): 16:1–16:38. arXiv:2002.02376. doi:10.1145/3484198. hdl:11585/850408. ISSN 0360-0300.
  25. Thome, Julian; Shar, Lwin Khin; Bianculli, Domenico; Briand, Lionel (2017). "Search-Driven String Constraint Solving for Vulnerability Detection". 2017 IEEE/ACM 39th International Conference on Software Engineering (ICSE). IEEE. pp. 198–208. doi:10.1109/ICSE.2017.26. ISBN 978-1-5386-3868-2.
  26. Karhumäki, Juhani; Plandowski, Wojciech (1994). "On the Defect Effect of Many Identities in Free Semigroups". Mathematical Aspects of Natural and Formal Languages. World Scientific. pp. 225–232. doi:10.1142/9789814447133_0012. ISBN 978-981-02-1914-7.
  27. Ciobanu, Laura; Holt, Derek; Rees, Sarah (2020). "Equations in groups that are virtually direct products". Journal of Algebra. 545: 88–99. arXiv:1806.00244. doi:10.1016/j.jalgebra.2018.10.044. ISSN 0021-8693.
  28. Schiek, Helmut (1973). "Equations Over Groups". In Boone, W. W.; Cannonito, F. B.; Lyndon, R. C. (eds.). Word Problems. Studies in Logic and the Foundations of Mathematics. Vol. 71. Elsevier. pp. 563–567. doi:10.1016/s0049-237x(08)71920-7. ISBN 978-0-7204-2271-9.
  29. Makanin, G. S. (1983). "Equations in a Free Group". Mathematics of the USSR-Izvestiya. 21 (3): 483–546. doi:10.1070/IM1983v021n03ABEH001803. ISSN 0025-5726.