# A spectral assignment approach for the graph isomorphism problem

## Abstract

In this paper, we propose algorithms for the graph isomorphism (GI) problem that are based on the eigendecompositions of the adjacency matrices. The eigenvalues of isomorphic graphs are identical. However, two graphs GA and GB can be isospectral but non-isomorphic. We first construct a GI testing algorithm for friendly graphs and then extend it to unambiguous graphs. We show that isomorphisms can be detected by solving a linear assignment problem (LAP). If the graphs possess repeated eigenvalues, which typically correspond to graph symmetries, finding isomorphisms is much harder. By repeatedly perturbing the adjacency matrices and by using properties of eigenpolytopes, it is possible to break symmetries of the graphs and iteratively assign vertices of GA to vertices of GB, provided that an admissible assignment exists. This heuristic approach can be used to construct a permutation which transforms GA into GB if the graphs are isomorphic. The methods will be illustrated with several guiding examples.

## 1. Introduction

We consider the problem of determining whether two undirected weighted graphs are isomorphic using spectral information. Efficient algorithms for the solution of the graph isomorphism (GI) or graph matching problem are required in a wide variety of different areas such as pattern recognition, object detection, image indexing, face recognition and fingerprint analysis. Furthermore, novel applications such as the analysis of neural networks and social networks require matching of graphs with millions or even billions of vertices [1]. No polynomial-time algorithm is known for deciding whether two arbitrary graphs are isomorphic, yet the problem is also not known to be NP-complete; the GI problem is one of only a few problems whose complexity class remains unresolved [13].
Recently, Babai showed that GI can be solved in quasi-polynomial time [3]. GI belongs to the larger family of isomorphism problems on algebraic structures such as groups or rings that seem to lie between P and NP-complete [2]. Another open question is whether GI can be solved efficiently using quantum computers. The GI problem can be regarded as a non-Abelian hidden subgroup problem (HSP) where the hidden subgroup is the automorphism group of the graph. An efficient solution of the HSP, which is the basis of many quantum algorithms, is only known for certain Abelian groups, whereas the general non-Abelian case remains open [10]. For several special classes of graphs, however, the GI problem is known to be solvable in polynomial time. These classes include, for instance, planar graphs [11] and graphs with bounded degree [20] or bounded eigenvalue multiplicity [4]. For an overview of isomorphism testing methods for these restricted graph classes, we refer to the study by [13]. Since the GI problem is challenging from a computational point of view, one is often forced to use different relaxations [7]. In the study by [1], the proposed GI testing approach for friendly graphs is based on convex relaxations where the set of permutation matrices is replaced by the set of doubly stochastic matrices. First, a quadratic problem is solved to find a doubly stochastic matrix that minimizes the cost function; the solution is then, in a second step, projected onto the set of permutation matrices by solving a linear assignment problem. These results are extended to a larger class of graphs in the study by [7]. In this paper, instead of a convex relaxation of the GI problem, we consider a different relaxation where the set of permutation matrices is replaced by the set of orthogonal matrices. We then construct a linear assignment problem based on the eigenvectors of the graphs. 
We show that for a certain class of graphs, the solution of the linear assignment problem is also the unique solution of the GI problem. For highly symmetric graphs, we propose an iterative algorithm which is based on spectral information and uses local perturbations of the adjacency matrices to break symmetries and to identify possible assignments. Our algorithm extends the applicability of existing spectral methods for the GI problem to strongly regular graphs with repeated eigenvalues. The paper is organized as follows: Section 2 contains a brief description of different formulations of the GI problem. An overview of spectral properties of graphs is presented in Section 3. Furthermore, we show how these properties can be used to find isomorphisms between graphs with simple spectrum. In Section 4, we propose a novel eigendecomposition-based algorithm to determine whether two highly symmetric graphs are isomorphic. Numerical results for a number of different benchmark problems including strongly regular graphs and isospectral, but non-isomorphic graphs are presented in Section 5. Section 6 lists open questions and possible future work.

## 2. The GI problem

Given two weighted undirected graphs $$\mathcal{G}_{A} = (\mathcal{V}, \mathcal{E}_{A})$$ and $$\mathcal{G}_{B} = (\mathcal{V}, \mathcal{E}_{B})$$ with adjacency matrices A and B, where $$\mathcal{V} = \{ \mathcal{v}_{1}, \dots , \mathcal{v}_{n} \}$$ is the set of vertices and $$\mathcal{E}_{A}$$ and $$\mathcal{E}_{B}$$ are the sets of edges, we want to determine whether these graphs are isomorphic.
Definition 2.1 Two graphs $$\mathcal{G}_{A}$$ and $$\mathcal{G}_{B}$$ are said to be isomorphic—denoted by $$\mathcal{G}_{A} \cong \mathcal{G}_{B}$$—if one of the following equivalent conditions is satisfied: (i) There exists a permutation $$\pi \in \mathscr{S}_{n}$$ such that $$(\mathcal{v}_{i}, \mathcal{v}_{j}) \in \mathcal{E}_{A}{\; \Leftrightarrow \;} (\mathcal{v}_{\pi(i)}, \mathcal{v}_{\pi(j)}) \in \mathcal{E}_{B},$$ where $$\mathscr{S}_{n}$$ denotes the symmetric group of degree n. (ii) There exists a permutation matrix $$P \in \mathscr{P}_{n}$$ such that $$B = P^{T} A P,$$ where $$\mathscr{P}_{n}$$ is the set of all n × n permutation matrices.

The relation between the permutation π and the permutation matrix P = (pij) is given by $$p_{ij} = \begin{cases}1, & \textrm{if}\ \pi(i) = j, \\ 0, & \textrm{otherwise}. \end{cases}$$

The GI problem can also be rewritten as a combinatorial optimization problem of the form \begin{align} \min_{P \in \mathscr{P}_{n}} \lVert B - P^{T} A P\rVert_{F}, \end{align} (2.1) where $$\left \lVert .\right \rVert _{F}$$ denotes the Frobenius norm. The graphs $$\mathcal{G}_{A}$$ and $$\mathcal{G}_{B}$$ are isomorphic if and only if the minimum of the above cost function is zero. Since $${\lVert B - P^{T} A P \rVert}_{{F}}^{{2}} = {\left\lVert B \right\rVert}_{{F}}^{{2}} - 2\textrm{tr}(B^{T} P^{T} A P) + {\left\lVert A \right\rVert}_{{F}}^{{2}},$$ cost function (2.1) is minimized if the term $$\textrm{tr}( B^{T} P^{T} A P )$$, which is the cost function of the NP-complete quadratic assignment problem, is maximized and vice versa.

Definition 2.2 An isomorphism from a graph $$\mathcal{G}$$ to itself is called an automorphism. The set of all automorphisms of a graph forms a group under matrix multiplication, the so-called automorphism group, typically denoted by $$Aut(\mathcal{G})$$.
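The correspondence between π and P and the evaluation of the cost function (2.1) can be sketched in a few lines of NumPy; the path graph and the permutation below are arbitrary illustrative choices, not examples from the text.

```python
import numpy as np

def perm_matrix(pi):
    """Permutation matrix P with p_ij = 1 iff pi(i) = j (pi given 0-based)."""
    n = len(pi)
    P = np.zeros((n, n))
    P[np.arange(n), pi] = 1.0
    return P

def gi_cost(A, B, P):
    """Frobenius cost ||B - P^T A P||_F of a candidate permutation matrix."""
    return np.linalg.norm(B - P.T @ A @ P)

# Illustrative check: relabel a path graph on 4 vertices.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
pi = np.array([2, 0, 3, 1])   # an arbitrary relabeling
P = perm_matrix(pi)
B = P.T @ A @ P               # B is isomorphic to A by construction
```

By construction, the cost vanishes for the correct permutation, whereas a wrong assignment (here, the identity) yields a strictly positive cost.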
If the graphs $$\mathcal{G}_{A}$$ and $$\mathcal{G}_{B}$$ are isomorphic, then the number of isomorphisms from $$\mathcal{G}_{A}$$ to $$\mathcal{G}_{B}$$ is identical to the number of automorphisms of $$\mathcal{G}_{A}$$ or $$\mathcal{G}_{B}$$, respectively [27].

Definition 2.3 A graph $$\mathcal{G}$$ is called asymmetric if the automorphism group is trivial. That is, Aut($$\mathcal{G}$$) = {I} for asymmetric graphs. If the automorphism group is non-trivial, we call the graph symmetric. The automorphism groups of strongly regular graphs, for example, can be highly non-trivial.

Definition 2.4 A graph is said to be regular (or weakly regular) if each vertex has the same number of neighbors and strongly regular if additionally integers α and β exist such that every pair of vertices vi and vj shares exactly α common neighbors if the vertices vi and vj are adjacent and exactly β common neighbors otherwise [26].

The Frucht graph shown in Fig. 3a, for instance, is weakly regular. An example of a strongly regular graph is the Paley graph shown in Fig. 5a. The solution of the minimization problem (2.1) is not unique if the graphs possess non-trivial symmetries; for isomorphic graphs, the number of feasible solutions corresponds to the number of isomorphisms. Whether symmetries exist or not, however, is in general difficult to determine a priori. Although symmetries typically correspond to repeated eigenvalues, the correspondence is not exact [19]. Examples of asymmetric graphs with repeated eigenvalues or symmetric graphs with simple spectrum (see also Fig. 3c) can be found in the study by [7], for example. In what follows, we will characterize graphs using spectral properties.

## 3. Graphs with distinct eigenvalues

An isomorphism testing algorithm for graphs with n distinct eigenvalues developed by Leighton and Miller is presented in the studies by [15] and [27]. The method determines the isomorphism by separating vertices that are not equivalent into different classes.
Based on the entries of the eigenvectors, these classes are refined until either an isomorphism is found or the graphs are shown to be non-isomorphic. Another approach that utilizes spectral information is presented in the study by [1], where so-called friendly graphs (defined below) are considered. The set of permutation matrices is replaced by the set of doubly stochastic matrices. After solving the resulting quadratic program, a linear assignment problem is solved to project the doubly stochastic matrix back onto the set of permutation matrices. [1] prove that for friendly isomorphic graphs, the relaxed problem is equivalent to the original GI problem. We will use a different approach for GI testing that relies on a relaxation to the manifold of orthogonal matrices and—for friendly graphs—requires only the solution of a single linear assignment problem. An extension of this method for graphs with repeated eigenvalues will be proposed in Section 4.

### 3.1. The two-sided orthogonal Procrustes problem

Let $$\mathscr{O}_{n} = \{ P \in{\mathbb{R}}^{n \times n} \mid P^{T} P = I\}$$ denote the set of all orthogonal matrices. Note that the set of permutation matrices $$\mathscr{P}_{n}$$ is a subset of $$\mathscr{O}_{n}$$. Provided that the matrices A and B are symmetric—we consider only undirected graphs—the solution of the relaxed problem \begin{align} \min_{P \in \mathscr{O}_{n}}\lVert B - P^{T} A P\rVert_{F}, \end{align} (3.1) which is called the two-sided orthogonal Procrustes problem [9, 25], can be computed analytically, provided that both A and B have distinct eigenvalues.
This result is captured in the following theorem:

Theorem 3.1 Given two symmetric matrices A and B with distinct eigenvalues, let $$A = V_{A} \Lambda _{A} {V_{A}^{T}}$$ and $$B = V_{B} \Lambda _{B} {V_{B}^{T}}$$ be the eigendecompositions, with $$\Lambda _{A} =\textrm{diag}\big (\lambda _{A}^{(1)}, \dots , \lambda _{A}^{(n)}\big )$$, $$\Lambda _{B} =\textrm{diag}\big (\lambda _{B}^{(1)}, \dots , \lambda _{B}^{(n)}\big )$$ and $$\lambda _{A}^{(1)} < \dots < \lambda _{A}^{(n)}$$ as well as $$\lambda _{B}^{(1)} < \dots < \lambda _{B}^{(n)}$$. Then the orthogonal matrix P* which minimizes (3.1) is given by $$P^{\ast} = V_{A} S {V_{B}^{T}},$$ where $$S =\textrm{diag}(\pm 1, \dots , \pm 1)$$.

A proof of the above theorem can be found in the study by [25]. The eigenvalues and corresponding eigenvectors of both matrices have to be sorted in the same order, either increasing or decreasing. Note that there are $$2^{n}$$ different orthogonal matrices which minimize the cost function.

Lemma 3.2 If the graphs $$\mathcal{G}_{A}$$ and $$\mathcal{G}_{B}$$ are isomorphic, then one of the $$2^{n}$$ solutions is the permutation matrix P that minimizes (2.1).

Proof. Since all eigenvalues are distinct, the eigenvectors of B are—up to the signs—permutations of the eigenvectors of A, i.e. $$V_{B} = P V_{A} \hat{S}$$, where $$P \in \mathscr{P}_{n}$$ permutes the rows and $$\hat{S} =\textrm{diag}(\pm 1, \dots , \pm 1)$$ flips the signs of the eigenvectors. If we now choose $$S = \hat{S}$$, then $$P^{\ast } = P V_{A} \hat{S}^{2} {V_{A}^{T}}$$. Using $$\hat{S}^{2} = I$$ and the orthogonality of VA, we obtain $$P^{\ast } = P \in \mathscr{P}_{n}$$.

Note that we are, however, not searching over the $$2^{n}$$ solutions. Let now c be the cost of the optimal solution of the relaxed problem, i.e.
$$c = \min_{P \in \mathscr{O}_{n}} \lVert B - P^{T} A P \rVert_{F} = \lVert B - P^{*T} A P^{\ast}\rVert_{F} = \left\lVert \Lambda_{B} - \Lambda_{A} \right\rVert_{F}\!.$$ As $$\mathscr{P}_{n} \subset \mathscr{O}_{n}$$, the graphs cannot be isomorphic if c≠0. If, on the other hand, c = 0, this implies that the eigenvalues of A and B are identical and the graphs $$\mathcal{G}_{A}$$ and $$\mathcal{G}_{B}$$ are isospectral but not necessarily isomorphic. In addition to the eigenvalues, the eigenvectors of the graphs can be used for isomorphism testing as shown in the following example.

Example 3.3 An example of isospectral graphs, taken from the study by [24], is shown in Fig. 1. Setting a = 1, b = 2 and c = 3, the eigenvectors of the graphs belonging to the largest eigenvalue $$\lambda _{A}^{(6)} = \lambda _{B}^{(6)} = 5.167$$ are \begin{align*} v_{A}^{(6)} =&\, [0.380, 0.092, 0.157, 0.655, 0.477, 0.407]^{T}, \\ v_{B}^{(6)} =&\, [0.222, 0.068, 0.352, 0.575, 0.353, 0.606]^{T}. \end{align*} Since the entries of the eigenvectors are different, $$v_{B}^{(6)}$$ cannot be written as a permutation of $$v_{A}^{(6)}$$, implying that $$\mathcal{G}_{A}$$ and $$\mathcal{G}_{B}$$ are not isomorphic. △

Fig. 1. Isospectral graphs.

Provided that the entries of the eigenvectors are distinct, this simple test can be used for GI testing. Here, we compared the entries of the eigenvectors corresponding to the largest eigenvalue since the largest eigenvalue of the adjacency matrix of a connected graph always has multiplicity one [19]. Furthermore, all entries of the corresponding eigenvector are strictly positive. Note that in general the normalized eigenvectors are only determined up to the sign. Thus, two comparisons might be required.
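This eigenvalue-and-eigenvector test can be sketched as follows. Since the weighted graphs of Fig. 1 are not reproduced here, the sketch uses a different, classical isospectral non-isomorphic pair on five vertices: the star $$K_{1,4}$$ and the disjoint union of a 4-cycle and an isolated vertex (both have eigenvalues 2, 0, 0, 0, −2). Because eigenvectors are determined only up to sign, the comparison uses sorted absolute values.

```python
import numpy as np

def spectral_test(A, B, tol=1e-8):
    """Compare spectra and, if isospectral, the sorted absolute entries of
    the eigenvectors belonging to the largest eigenvalue."""
    lam_A, V_A = np.linalg.eigh(A)   # eigenvalues in ascending order
    lam_B, V_B = np.linalg.eigh(B)
    if np.linalg.norm(lam_A - lam_B) > tol:
        return "different spectra: not isomorphic"
    a = np.sort(np.abs(V_A[:, -1]))  # leading eigenvectors, sign-insensitive
    b = np.sort(np.abs(V_B[:, -1]))
    if np.linalg.norm(a - b) > tol:
        return "isospectral but not isomorphic"
    return "possibly isomorphic"

# Star K_{1,4}: center vertex 0 joined to vertices 1..4.
star = np.zeros((5, 5))
star[0, 1:] = star[1:, 0] = 1.0
# C_4 plus an isolated vertex (disconnected, but the largest
# eigenvalue 2 is still simple here, so the test applies).
c4k1 = np.zeros((5, 5))
for i, j in [(0, 1), (1, 2), (2, 3), (3, 0)]:
    c4k1[i, j] = c4k1[j, i] = 1.0
```

For this pair, the spectra agree but the leading-eigenvector entries do not, so the graphs are correctly declared non-isomorphic.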
If the absolute values of all entries of an eigenvector are different—[27] calls such an eigenvector helpful—this gives us a canonical labeling of the vertices and the GI problem can be solved easily.

### 3.2. Friendly and unambiguous graphs

In the study by [1], friendly graphs are considered. In contrast to asymmetry, friendliness can be easily verified.

Definition 3.4 Let 𝟙 $$\in{\mathbb{R}}^{n}$$ denote the vector of ones. A graph $$\mathcal{G}_{A}$$ with adjacency matrix A is called friendly if A has distinct eigenvalues and all eigenvectors $$v_{A}^{(k)}$$ satisfy 𝟙$$^{T} v_{A}^{(k)} \ne 0$$.

As a result, it is possible to make the signs of corresponding eigenvectors consistent. This corresponds to finding the sign matrix S in Lemma 3.2. Thus, the permutation matrix which solves the GI problem can be computed directly. We will propose a different approach that relies on the solution of a linear assignment problem and will be generalized later on. For friendly graphs, we obtain:

Lemma 3.5 Every friendly graph $$\mathcal{G}_{A}$$ is asymmetric [1].

Proof. Let $$A = V_{A} \Lambda _{A} {V_{A}^{T}}$$ be the eigendecomposition. Assuming there exists $$P \in \mathscr{P}_{n}$$ with $$A = P^{T} A P$$, we obtain another eigendecomposition $$A = (P^{T} V_{A}) \Lambda_{A} (P^{T} V_{A})^{T}$$. Since the eigenvectors are determined up to the signs, $$V_{A} = P^{T} V_{A} S$$, where $$S =\textrm{diag}(\pm 1, \dots , \pm 1)$$. Thus, 𝟙$$^{T} V_{A} = $$𝟙$$^{T} P^{T} V_{A} S = $$𝟙$$^{T} V_{A} S$$. Each entry of the vector 𝟙$$^{T} V_{A}$$ must be non-zero since the graph is friendly. Thus, the equation can only be satisfied if S = I. However, S = I implies $$V_{A} = P^{T} V_{A}$$ and hence P = I. As a result, the automorphism group only contains the identity matrix and the graph is asymmetric.

The converse is not true: the Frucht graph shown in Fig. 3a, for instance, is asymmetric but not friendly. Furthermore, there are asymmetric graphs with repeated eigenvalues. A Venn diagram of different graph classes is shown in Fig.
2 (reproduced from the study by [7]; examples of graphs in each of these categories can also be found there). Even if eigenvectors are not friendly, it is often possible to make the signs of two corresponding eigenvectors consistent.

Fig. 2. Different classes of graphs based on the study by [7]. Friendly graphs are given by the intersection of graphs with simple spectrum and graphs whose eigenvectors are non-orthogonal to 𝟙.

Definition 3.6 We call an eigenvector v ambiguous if v and −v have exactly the same entries. That is, there exists $$P \in \mathscr{P}_{n}$$ with v = −Pv. A graph without ambiguous eigenvectors is called unambiguous.

This property can be easily verified by sorting the entries. Note that ambiguity implies unfriendliness since v = −Pv results in 𝟙$$^{T} v = -$$𝟙$$^{T} v$$ and thus 𝟙$$^{T} v = 0$$. The class of graphs whose eigenvectors are not ambiguous is thus larger than the class of friendly graphs.

### 3.3. A spectral assignment approach for friendly and unambiguous graphs

We now want to construct a linear assignment problem for friendly graphs that solves the GI problem. In our approach, the cost of assigning vertices of $$\mathcal{G}_{A}$$ to vertices of $$\mathcal{G}_{B}$$ will be based on the eigenvectors.

Definition 3.7 Let $$V, W \in{\mathbb{R}}^{n \times m}$$ be two matrices, then the cost of assigning V to W is defined to be $$C(V, W) = (c_{ij}) \in{\mathbb{R}}^{n \times n}$$, with $$c_{ij} = \sum_{k=1}^{m} \left\lvert v_{i}^{(k)} - w_{j}^{(k)} \right\rvert,$$ where $$v^{(k)}$$ and $$w^{(k)}$$ are the column vectors of V and W, respectively.

Lemma 3.8 Let V, W and C = C(V, W) be as in the above definition.
Define \begin{align} c = \min_{P \in \mathscr{P}_{n}} \textrm{tr} ( P^{T} C ) \end{align} (3.2) to be the minimal cost of the linear assignment problem. Then V = PW for $$P \in \mathscr{P}_{n}$$ if and only if c = 0.

Proof. We will first show that this result holds for vectors v and w. Assume that v = Pw for $$P \in \mathscr{P}_{n}$$. It follows that $$v_{i} = w_{\pi(i)}$$ and thus $$c_{i, \pi (i)} = \left \lvert v_{i} - w_{\pi (i)} \right \rvert = 0$$. Furthermore, $$\textrm{tr} (P^{T} C) = \sum_{i,j=1}^{n} p_{ij} c_{ij} = \sum_{i=1}^{n} c_{i, \pi(i)} = 0.$$ For the other direction, assume that c = 0 and that the corresponding permutation matrix is $$\hat{P}$$. Then $$c_{i, \hat{\pi }(i)} = \lvert v_{i} - w_{\hat{\pi }(i)}\rvert = 0$$ and consequently $$v = \hat{P} w$$. For matrices, the proof is almost identical. Assume that V = PW. Then $$v^{(k)} = P w^{(k)}$$ for all column vectors. Thus, with the first part, we obtain $$c_{i, \pi(i)} = 0$$ and $$\textrm{tr} \left ( P^{T} C \right ) = 0$$. The other direction follows in the same way.

The linear assignment problem can be solved in $$O(n^{3})$$ using the Hungarian method [5, 14]. For two friendly graphs $$\mathcal{G}_{A}$$ and $$\mathcal{G}_{B}$$ with adjacency matrices A and B and eigendecompositions as described in Theorem 3.1, we define $$C(\mathcal{G}_{A}, \mathcal{G}_{B})=C(V_{A}, V_{B})$$. Note that in the following theorem we assume that the signs of the corresponding eigenvectors $$v_{A}^{(k)}$$ and $$v_{B}^{(k)}$$ are consistent. This is possible as the sum of the entries of each eigenvector is non-zero.

Theorem 3.9 Let $$\mathcal{G}_{A}$$ and $$\mathcal{G}_{B}$$ be friendly graphs. Define $$C=C(\mathcal{G}_{A}, \mathcal{G}_{B})$$ and c as in (3.2). Then $$\mathcal{G}_{A}\cong \mathcal{G}_{B}$$ if and only if ΛA =ΛB and c = 0. The solution P of the linear assignment problem is then a solution of GI.

Proof. Assume that $$\mathcal{G}_{A}\cong \mathcal{G}_{B}$$ and thus $$A = V_{A} \Lambda _{A} {V_{A}^{T}} = P B P^{T} = (P V_{B}) \Lambda _{B} (P V_{B})^{T}$$.
Hence $$V_{A} = P V_{B}$$ (after making the signs consistent) and, using Lemma 3.8, we obtain $$\textrm{tr} \left ( P^{T} C \right ) = 0$$. If, on the other hand, c = 0, Lemma 3.8 implies that $$V_{A} = P V_{B}$$ and thus $$\lVert B - P^{T} A P \rVert_{F} = \left\lVert V_{B} \Lambda_{B} {V_{B}^{T}} - P^{T} V_{A} \Lambda_{A} {V_{A}^{T}} P\right\rVert_{F} = \left\lVert \Lambda_{B} - \Lambda_{A}\right\rVert_{F} = 0.$$

For friendly graphs, we propose the GI testing approach described in Algorithm 1. If an eigenvector is ambiguous, we will use the absolute values of the entries for the computation of the cost matrix.

Algorithm 1 Graph isomorphism testing for friendly graphs.

Example 3.10 Let us illustrate the definition of ambiguity: (i) The vectors $$v = [1, 2, 0, -3]^{T}$$ and $$w = [0, -1, -2, 3]^{T}$$ are unfriendly, but not ambiguous, and v can only be assigned to − w using π = (1 2 3). (ii) The vectors $$v = [1, 2, -1, -2]^{T}$$ and $$w = [-2, -1, 1, 2]^{T}$$, on the other hand, are ambiguous since v can be assigned to w using π = (1 3 2 4) or to − w using π = (1 2). Taking absolute values leads to two spurious solutions, given by π = (1 3 2) and π = (1 2 4). △

If the eigenvectors are not ambiguous, we can make the signs consistent (e.g. by sorting the entries) and apply Algorithm 1 in the same way by replacing only step 2. Note that we only assumed that the signs of the eigenvectors are consistent in Theorem 3.9; the friendliness property was not used explicitly.

Example 3.11 Let us consider different graph types to illustrate the idea behind the assignment approach: (i) Given the Frucht graph shown in Fig. 3a and the permutation of the graph shown in Fig. 3b, the resulting cost matrix C is displayed in Fig. 3g. The solution of the linear assignment problem is $$\pi = \left[\begin{array}{cccccccccccc} 1 & 2 & 3 & 4 & 5 & 6 & 7 & 8 & 9 & 10 & 11 & 12 \\ 4 & 5 & 6 & 1 & 2 & 3 & 9 & 8 & 7 & 12 & 11 & 10 \end{array}\right]$$ and the cost of the assignment is zero.
The graph has simple spectrum, is asymmetric and regular, and thus not friendly (for a regular graph, 𝟙 is an eigenvector, so all other eigenvectors are orthogonal to 𝟙). However, only one eigenvector is ambiguous. Even without taking into account this eigenvector, the algorithm successfully computes the correct permutation matrix. (ii) An example of a graph with non-trivial automorphism group but simple spectrum, taken from the study by [7], and a random permutation are shown in Fig. 3c–d; the corresponding cost matrix C is depicted in Fig. 3h. Here, two eigenvectors are ambiguous and the solution of the LAP is not unique since v1 could be assigned to v2 or v4 and v2 to v1 or v3. These assignments, however, are not independent; as soon as one of the first four vertices is assigned, the others follow automatically. Feasible solutions are $$\pi_{1} = \left[\begin{array}{cccccccc} 1 & 2 & 3 & 4 & 5 & 6 & 7 & 8 \\ 4 & 3 & 2 & 1 & 6 & 5 & 7 & 8 \end{array}\right], \quad \pi_{2} = \left[\begin{array}{cccccccc} 1 & 2 & 3 & 4 & 5 & 6 & 7 & 8 \\ 2 & 1 & 4 & 3 & 6 & 5 & 7 & 8 \end{array}\right].$$ If we do not use the absolute values of the ambiguous eigenvectors for the computation of the cost matrix, then there are four possible combinations: we can assign $$v_{A}^{(3)}$$ to $$\pm v_{B}^{(3)}$$ and $$v_{A}^{(6)}$$ to $$\pm v_{B}^{(6)}$$. One combination will result in π1, one in π2, the remaining two lead to non-zero assignment costs. Thus, taking absolute values prevents conflicting information about possible assignments, but also results in spurious solutions (cf. Example 3.10). (iii) The Facebook social circles graph [21], available through the SNAP website [16], consists of 4039 vertices and 88234 edges. The adjacency matrices of the original and permuted graph are shown in Fig. 3e–f. The cluster structure of the graph is clearly visible in Fig. 3e. Numerically, the graph has a couple of repeated eigenvalues (around λ = −1, λ = 0 and λ = 1) and we neglect the corresponding eigenvectors.
Nevertheless, Algorithm 1 returns a valid assignment that solves the GI problem. △

Fig. 3. (a–b) Original and permuted Frucht graph. (c–d) Original and permuted house graph. (e–f) Original and permuted adjacency matrix of the Facebook graph. (g–h) Cost matrix for Frucht graph and house graph. White entries represent possible assignments with cost cij < ε.

The examples show that even with incomplete information it is possible to compute valid solutions of the GI problem. If the solution is not unique, constructing the cost matrix reduces the search space significantly since only zero-cost entries need to be taken into account. Furthermore, the examples illustrate the difficulties arising from ambiguous eigenvectors.

## 4. Graphs with repeated eigenvalues

If the graphs possess repeated eigenvalues, finding isomorphisms is much harder. The eigenvectors belonging to repeated eigenvalues are unique only up to basis rotations, and we cannot construct a linear assignment problem by comparing corresponding eigenvectors anymore. As mentioned above, repeated eigenvalues typically correspond to graph symmetries. It can be shown that strongly regular graphs, for instance, possess at most three distinct eigenvalues [19].

Definition 4.1 Given a graph $$\mathcal{G}_{A}$$ with adjacency matrix A, let $$\lambda _{A} = \big [ \lambda _{A}^{(1)}, \dots , \lambda _{A}^{(m)} \big ]$$ be the eigenvalues of the graph $$\mathcal{G}_{A}$$ with multiplicities $$\mu _{A} = \big [ \mu _{A}^{(1)}, \dots , \mu _{A}^{(m)} \big ]$$, i.e. $$\sum _{k=1}^{m} \mu _{A}^{(k)} = n$$.
We then partition VA into $$V_{A} = \big [ V_{A}^{(1)}, \dots , V_{A}^{(m)} \big ]$$, with $$V_{A}^{(k)} \in{\mathbb{R}}^{n \times \mu _{A}^{(k)}}$$. That is, $$V_{A}^{(k)}$$ is either the eigenvector belonging to the eigenvalue $$\lambda _{A}^{(k)}$$ or the matrix whose columns form an orthogonal basis of the eigenspace.

Example 4.2 We will use the following guiding examples to illustrate the proposed isomorphism testing approach for graphs with repeated eigenvalues: (i) The eigenvalues of the cycle graph shown in Fig. 4a are λA = [−2, −1, 1, 2] with multiplicities μA = [1, 2, 2, 1]. The automorphism group of the cycle graph is $$D_{6}$$ and thus $$\left \lvert \textrm{Aut}(\mathcal{G})\right \rvert = 12$$. (ii) The eigenvalues of the strongly regular Paley graph shown in Fig. 5a are $$\lambda _{A} = \big [ \frac{-1 - \sqrt{17}}{2}, \frac{-1 + \sqrt{17}}{2}, 8 \big ]$$ with multiplicities $$\mu _{A} = \big [ 8, 8, 1 \big ]$$. Here, Aut($$\mathcal{G}$$) ≅ $$S_{5}$$. Thus, the graph possesses $$\left \lvert \textrm{Aut}(\mathcal{G})\right \rvert = 5! = 120$$ automorphisms. △

Fig. 4. GI testing procedure for the cycle graph. The various colors represent self-loops with different weights. The bottom row shows the structure of the corresponding cost matrices C. White entries represent possible assignments with cost cij < ε. After two perturbations of the graphs, the solution of the LAP is unique.

Fig. 5. (a) Strongly regular Paley graph. (b–e) Structure of the cost matrices C after 1, 2, 3 and 4 successful assignments. White entries represent assignments with cij < ε. After four perturbations, the solution is unique.

### 4.1. Eigenpolytopes

For the graphs in the previous example, it is not possible to apply Algorithm 1 directly. If we compare only eigenvectors belonging to distinct eigenvalues, this leads to C = 0. That is, any permutation matrix would be a feasible solution of the linear assignment problem. Therefore, we need to exploit additional information encoded in matrices representing orthogonal projections onto the eigenspaces of repeated eigenvalues.

Definition 4.3 Let $$\lambda _{A}^{(k)}$$ be a repeated eigenvalue of graph $$\mathcal{G}_{A}$$. For a vertex vi, define $$V_{A}^{(k)}(\mathcal{v}_{i})$$ to be the ith row of $$V_{A}^{(k)}$$. The convex hull of all vectors $$V_{A}^{(k)}(\mathcal{v}_{i})$$, $$i = 1, \dots , n$$, is called the eigenpolytope of the graph belonging to the eigenvalue $$\lambda _{A}^{(k)}$$.

The row vectors $$V_{A}^{(k)}(\mathcal{v}_{i})$$ clearly depend on the orthogonal basis chosen for the eigenspace, but the scalar product is independent of the choice of basis [8]. The matrix $$E_{A}^{(k)} = V_{A}^{(k)} \left(V_{A}^{(k)}\right)^{T},$$ i.e. $$(E_{A}^{(k)})_{ij} = \big \langle V_{A}^{(k)}(\mathcal{v}_{i}), \, V_{A}^{(k)}(\mathcal{v}_{j}) \big \rangle$$, represents the orthogonal projection onto the column space of $$V_{A}^{(k)}$$ and is an invariant of the eigenspace that does not depend on the orthogonal basis chosen for $$V_{A}^{(k)}$$, see also the study by [6]. Thus, $$E_{B}^{(k)} = P^{T} E_{A}^{(k)} P$$, which in itself can again be interpreted as a GI problem. For a detailed description of the relation between a graph and the geometry of its eigenpolytopes, we refer to the studies by [6] and [8].
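The basis independence of the projection matrices can be illustrated with a short numerical sketch (NumPy; the 4-cycle, whose eigenvalue 0 has multiplicity 2, is our own minimal test case, not an example from the text):

```python
import numpy as np

def projections(A, tol=1e-8):
    """Orthogonal projections E^(k) = V^(k) (V^(k))^T onto the eigenspaces,
    grouping numerically repeated eigenvalues within the tolerance tol."""
    lam, V = np.linalg.eigh(A)
    Es, k = [], 0
    while k < len(lam):
        m = int(np.sum(np.abs(lam - lam[k]) < tol))  # multiplicity of lam[k]
        Vk = V[:, k:k + m]
        Es.append(Vk @ Vk.T)
        k += m
    return Es

# Cycle C_4: eigenvalues -2, 0 (multiplicity 2), 2.
A = np.zeros((4, 4))
for i in range(4):
    A[i, (i + 1) % 4] = A[(i + 1) % 4, i] = 1.0

E = projections(A)[1]            # projection onto the eigenspace of 0
# Invariance: rotating the eigenspace basis by any orthogonal Q
# yields the same projection, since (VQ)(VQ)^T = V Q Q^T V^T = V V^T.
lam, V = np.linalg.eigh(A)
Vk = V[:, 1:3]                   # orthonormal basis of the 0-eigenspace
theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
E_rot = (Vk @ Q) @ (Vk @ Q).T
```

As expected, `E_rot` coincides with `E`, and both are idempotent, as orthogonal projections must be.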
We now exploit properties of the matrices $$E_{A}^{(k)}$$ to identify isomorphisms. In what follows, we will show that by comparing eigenvectors and eigenpolytopes, it is possible to compute isomorphisms of strongly regular graphs such as the Paley graph.

### 4.2. A spectral assignment approach for graphs with symmetries

As described above, repeated eigenvalues complicate GI testing. Our heuristic approach is based on finding local perturbations of the adjacency matrices A and B that break symmetries without changing the isomorphism. Let us illustrate the basic idea with a simple example.

Example 4.4 Let us consider the cycle graphs shown in Fig. 4a. In order to find an assignment for vertices of $$\mathcal{G}_{A}$$ to vertices of $$\mathcal{G}_{B}$$, we perturb the adjacency matrices A and B. If we add a self-loop to vertex v1 of $$\mathcal{G}_{A}$$ and v1 of $$\mathcal{G}_{B}$$, the two graphs remain isomorphic. Thus, we assign vertex v1 to vertex v1 of $$\mathcal{G}_{B}$$. The updated graphs are shown in Fig. 4b; the marked vertices denote self-loops with weight 1. Now, we try to assign vertex v2 of $$\mathcal{G}_{A}$$ to a vertex of $$\mathcal{G}_{B}$$. Since we have broken the cyclic symmetry, there are now only two possible assignments (due to the remaining reflection symmetry). Vertex v2 of $$\mathcal{G}_{A}$$ could be either assigned to vertex v5 or v6 of $$\mathcal{G}_{B}$$. Adding self-loops with weight 2 to vertex v2 of $$\mathcal{G}_{A}$$ and vertex v5 of $$\mathcal{G}_{B}$$—shown in Fig. 4c—the resulting graph is friendly and thus asymmetric. The permutation matrix P could be computed using Algorithm 1. Alternatively, the procedure described above can be repeated until a valid assignment for all vertices is found. The resulting graphs are shown in Fig. 4d. △

Let us formalize the above procedure. We start with the original adjacency matrices A and B and construct cost matrices C(k), $$k = 1, \dots , m$$, as follows.
For simple eigenvectors, we use the cost matrix from Definition 3.7, i.e. $$C^{(k)} = C(V_{A}^{(k)}, V_{B}^{(k)})$$. For repeated eigenvalues, we compute the projection matrices $$E_{A}^{(k)}$$ and $$E_{B}^{(k)}\!,$$ and check for each row i of $$E_{A}^{(k)}$$ whether it can be written as a permutation of row j of $$E_{B}^{(k)}$$ by comparing the sorted vectors.

Definition 4.5 Let $$E_{A}^{(k)}(\mathcal{v}_{i})$$ and $$E_{B}^{(k)}(\mathcal{v}_{j})$$ be the ith and jth row of the matrices $$E_{A}^{(k)}$$ and $$E_{B}^{(k)}$$, respectively, and let $$s :{\mathbb{R}}^{n} \to{\mathbb{R}}^{n}$$ be a function that sorts the entries of a vector. For repeated eigenvalues, we define $$C^{(k)} = (c_{ij}^{(k)})$$, with $$c_{ij}^{(k)} = \big \lVert s \big ( E_{A}^{(k)}(\mathcal{v}_{i}) \big ) - s \big ( E_{B}^{(k)}(\mathcal{v}_{j}) \big ) \big \rVert _{F}$$.

Note that this is only a heuristic approach and might lead to wrong assignments. However, utilizing properties of the eigenpolytopes improves the efficiency of the algorithm significantly and backtracking is required only in exceptional cases. For two graphs $$\mathcal{G}_{A}$$ and $$\mathcal{G}_{B}$$, the cost matrix is then defined as $$C(\mathcal{G}_{A}, \mathcal{G}_{B}) = \sum_{k=1}^{m} C^{(k)}.$$ The entries cij represent the costs of assigning vi of $$\mathcal{G}_{A}$$ to vj of $$\mathcal{G}_{B}$$. To determine possible assignments, we again compute $$C=C(\mathcal{G}_{A}, \mathcal{G}_{B})$$ and solve the resulting linear assignment problem $$c = \min_{P \in \mathscr{P}_{n}} \textrm{tr} (P^{T} C).$$ For the unperturbed cycle graph and Paley graph, the resulting cost matrices are zero, which implies that any vertex of $$\mathcal{G}_{A}$$ can initially be assigned to any vertex of $$\mathcal{G}_{B}$$. However, these assignments cannot be chosen independently. Thus, we assign vertices iteratively using local perturbations of the graphs as described in Example 4.4.
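The construction of the combined cost matrix can be sketched as follows. This is our own simplified Python rendition of Definitions 3.7 and 4.5 (not the authors' Matlab implementation): ambiguous eigenvectors are handled via absolute values as in Algorithm 1, and the final minimization is done by brute force for the 6-vertex example, where in practice the Hungarian method (e.g. SciPy's `linear_sum_assignment`) would be used.

```python
import numpy as np
from itertools import permutations

def cost_matrix(A, B, tol=1e-8):
    """C = sum_k C^(k): eigenvector distances for simple eigenvalues
    (Definition 3.7), sorted-row distances of the projections E^(k)
    for repeated eigenvalues (Definition 4.5)."""
    lam, V_A = np.linalg.eigh(A)
    _, V_B = np.linalg.eigh(B)          # B is isospectral to A here
    n = len(lam)
    C = np.zeros((n, n))
    k = 0
    while k < n:
        m = int(np.sum(np.abs(lam - lam[k]) < tol))   # multiplicity
        if m == 1:
            vA, vB = V_A[:, k], V_B[:, k]
            if np.allclose(np.sort(vA), np.sort(-vA)):
                vA, vB = np.abs(vA), np.abs(vB)       # ambiguous eigenvector
            elif vA.sum() * vB.sum() < 0:
                vB = -vB                              # make signs consistent
            C += np.abs(vA[:, None] - vB[None, :])
        else:
            EA = V_A[:, k:k+m] @ V_A[:, k:k+m].T      # projection E_A^(k)
            EB = V_B[:, k:k+m] @ V_B[:, k:k+m].T
            sA, sB = np.sort(EA, axis=1), np.sort(EB, axis=1)
            C += np.linalg.norm(sA[:, None, :] - sB[None, :, :], axis=2)
        k += m
    return C

# Unperturbed cycle C_6 and a fixed relabeling of it.
n = 6
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
pi = [3, 0, 5, 1, 4, 2]
P = np.eye(n)[pi]
B = P.T @ A @ P
C = cost_matrix(A, B)
# Minimal LAP cost (brute force over S_6 for illustration only).
c = min(sum(C[i, p[i]] for i in range(n)) for p in permutations(range(n)))
```

Consistent with the discussion above, the cost matrix of the unperturbed cycle graph vanishes entirely: every vertex can initially be assigned to every vertex, and symmetry-breaking perturbations are needed to single out an isomorphism.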
After perturbing the graphs, symmetries are destroyed and the number of zero entries of the cost matrix decreases until only one feasible assignment remains. In order to perturb the adjacency matrices A and B, and hence the eigenvalues and eigenvectors, we use single-entry matrices representing self-loops with different weights w.

Definition 4.6 Define $$D_{i}(w) =\textrm{diag}(d_{1}, \dots , d_{n})$$ to be the diagonal matrix with $$d_{j} = \begin{cases} w, & \textrm{if } j = i, \\ 0, & \textrm{otherwise}. \end{cases}$$

The proposed method for graphs with repeated eigenvalues is described in Algorithm 2. The algorithm can be stopped as soon as the solution of the LAP is unique. The number of iterations required to obtain a unique solution depends on the order in which the vertices are perturbed. In the description of the algorithm, we have not included backtracking techniques. Backtracking is needed if a previously found assignment does not result in a correct permutation. We then delete the previous assignment and try to find a different assignment for the current vertex. Backtracking is required only for certain graph types, as illustrated in Section 5.

Algorithm 2 GI testing for graphs with repeated eigenvalues

Example 4.7 Let us consider again the graphs from Example 4.2: (i) For the cycle graph, the cost matrices C that result in successful assignments are shown in Fig. 4a–d. Without perturbing the adjacency matrices, each vertex of GA can be assigned to each vertex of GB and the cost matrix C is zero. After one perturbation, all eigenvalues are distinct, but due to the remaining reflection symmetry the solution is not unique and there are still two ambiguous eigenvectors (see Example 3.11). After two iterations, the solution is unique and the resulting permutation is given by $$\pi = \left[\begin{array}{cccccc} 1 & 2 & 3 & 4 & 5 & 6 \\ 1 & 5 & 3 & 4 & 2 & 6 \end{array}\right].$$ (ii) For the Paley graph, the cost matrices after 1, 2, 3 and 4 successful assignments are shown in Fig.
5b–e. After the fourth perturbation, the solution is unique and $$\pi = \left[\begin{array}{ccccccccccccccccc} 1 & 2 & 3 & 4 & 5 & 6 & 7 & 8 & 9 & 10 & 11 & 12 & 13 & 14 & 15 & 16 & 17\\ 1 & 6 & 15 & 3 & 11 & 7 & 17 & 12 & 9 & 8 & 4 & 2 & 5 & 16 & 10 & 13 & 14\end{array}\right].$$ △

These examples demonstrate that the proposed method successfully computes isomorphisms of graphs with repeated eigenvalues.

5. Benchmark problems

In this section, we present numerical results for benchmark problems downloaded from the studies by [28] and [18]. For our computations, we used Matlab and an error tolerance $$\varepsilon = 10^{-6}$$. That is, two eigenvalues of a matrix are defined to be identical if their difference is less than ε. Furthermore, an assignment is accepted if the cost c of the solution of the LAP is less than ε. For all graphs downloaded from the study by [28], the algorithm returned correct results without backtracking. Results for larger benchmark problems used by conauto are presented in Table 1. For each benchmark problem, we run the proposed algorithm 100 times using different randomly generated permutations of the original graph. Here, n is the number of vertices, noBT the number of runs where no backtracking was required and BT the number of runs where backtracking was required to find an isomorphism. The column steps lists the average number of backtracking steps needed to find an isomorphism and the last column lists the average runtime in seconds. The efficiency of the algorithm could easily be improved by reimplementing it in C or C++. The results show that the algorithm returns correct results for most of the benchmark problems without backtracking. Only the Steiner triple system graph (1) and the union of strongly regular graphs (16) require backtracking for almost all test cases.
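The iterative procedure of Algorithm 2 can be sketched compactly as follows (Python; this is a simplified greedy variant without the backtracking discussed above, using the tolerance $$\varepsilon = 10^{-6}$$ from the experiments and a heuristic choice of self-loop weights; all function names are my own):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def cost_matrix(A, B, tol=1e-6):
    """Heuristic spectral cost of assigning vertices of G_A to vertices of G_B,
    built from the sorted rows of the spectral projections (Definition 4.5)."""
    lam, V_A = np.linalg.eigh(A)
    _, V_B = np.linalg.eigh(B)
    n = A.shape[0]
    C, k = np.zeros((n, n)), 0
    while k < n:
        idx = np.where(np.abs(lam - lam[k]) < tol)[0]
        E_A = V_A[:, idx] @ V_A[:, idx].T
        E_B = V_B[:, idx] @ V_B[:, idx].T
        sA, sB = np.sort(E_A, axis=1), np.sort(E_B, axis=1)
        C += np.linalg.norm(sA[:, None, :] - sB[None, :, :], axis=2)
        k = idx[-1] + 1
    return C

def spectral_assign(A, B, tol=1e-6):
    """Greedy sketch of Algorithm 2: fix one vertex per round by adding matching
    self-loops D_i(w), keeping a choice only if the perturbed LAP cost stays zero."""
    n = A.shape[0]
    A, B = A.astype(float).copy(), B.astype(float).copy()
    pi = [None] * n
    for i in range(n):
        C = cost_matrix(A, B, tol)
        for j in np.argsort(C[i]):            # try the cheapest candidates first
            if C[i, j] > tol or j in pi:
                continue
            w = float((i + 1) * n)            # a fresh self-loop weight per vertex
            A_try, B_try = A.copy(), B.copy()
            A_try[i, i] += w                  # A + D_i(w)
            B_try[j, j] += w                  # B + D_j(w)
            C_try = cost_matrix(A_try, B_try, tol)
            r, c = linear_sum_assignment(C_try)
            if C_try[r, c].sum() < tol:       # perturbed graphs still match
                A, B, pi[i] = A_try, B_try, j
                break
        if pi[i] is None:
            return None                       # here Algorithm 2 would backtrack
    return pi

# The 6-cycle against a relabelled copy of itself:
A0 = np.roll(np.eye(6), 1, axis=0) + np.roll(np.eye(6), -1, axis=0)
P0 = np.eye(6)[[2, 0, 3, 5, 1, 4]]
B0 = P0.T @ A0 @ P0
pi = spectral_assign(A0, B0)
P = np.eye(6)[pi]
assert np.allclose(B0, P.T @ A0 @ P)          # a valid isomorphism was found
```

On the cycle example the greedy strategy succeeds without backtracking; for hard instances such as the Steiner triple system graphs, the `return None` branch would trigger and a backtracking step would be required.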
Table 1 Test results for various benchmark graphs

| Type | # | Name | n | noBT | BT | steps (avg) | time [s] |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Steiner triple system graphs | 1 | sts-19_57 | 57 | 5 | 95 | 37.91 | 11.051 |
| Latin square graphs (prime order) | 2 | latin-3_9 | 9 | 100 | 0 | 0.00 | 0.004 |
| | 3 | latin-5_25 | 25 | 100 | 0 | 0.00 | 0.017 |
| | 4 | latin-7_49 | 49 | 98 | 2 | 1.00 | 0.141 |
| Latin square graphs (prime power order) | 5 | latin-2_4 | 4 | 100 | 0 | 0.00 | 0.001 |
| | 6 | latin-4_16 | 16 | 100 | 0 | 0.00 | 0.008 |
| | 7 | latin-6_36 | 36 | 74 | 26 | 2.31 | 0.086 |
| Paley graphs (prime order) | 8 | paley-prime_13 | 13 | 89 | 11 | 1.09 | 0.006 |
| | 9 | paley-prime_29 | 29 | 100 | 0 | 0.00 | 0.031 |
| Paley graphs (prime power order) | 10 | paley-power_9 | 9 | 100 | 0 | 0.00 | 0.002 |
| | 11 | paley-power_25 | 25 | 100 | 0 | 0.00 | 0.020 |
| Lattice graphs | 12 | lattice(4)_16 | 16 | 100 | 0 | 0.00 | 0.010 |
| | 13 | lattice(6)_36 | 36 | 100 | 0 | 0.00 | 0.067 |
| Triangular graphs | 14 | triangular(7)_21 | 21 | 100 | 0 | 0.00 | 0.013 |
| | 15 | triangular(10)_45 | 45 | 100 | 0 | 0.00 | 0.097 |
| Unions of strongly regular graphs | 16 | usr(1)_29-1 | 29 | 11 | 89 | 13.49 | 0.392 |
| Clique-connected cubic hypo-Hamiltonian graphs | 17 | chh_cc(1-1)_22-1 | 22 | 100 | 0 | 0.00 | 0.006 |
| | 18 | chh_cc(2-1)_44-1 | 44 | 100 | 0 | 0.00 | 0.097 |
| Non-disjoint unions of undirected tripartite graphs | 19 | tnn(1)_26-1 | 26 | 100 | 0 | 0.00 | 0.066 |
| | 20 | tnn(2)_52-1 | 52 | 100 | 0 | 0.00 | 0.700 |
| Random graphs | 21 | iso_r01N_s20 | 20 | 100 | 0 | 0.00 | 0.002 |
| | 22 | iso_r01N_s40 | 40 | 100 | 0 | 0.00 | 0.010 |

In order to analyze the scalability of the spectral assignment approach, we compare it with the state-of-the-art graph automorphism and isomorphism tool nauty [22, 24]. For each benchmark graph, we run nauty 100 times using different randomly generated permutations. Additionally, each GI instance is solved 10000 times to obtain more accurate runtimes. The results are shown in Fig. 6. We expect similar results for other tools such as conauto [17] or bliss [12] (for a comparison of these algorithms, see the study by [23]). While the absolute runtimes of nauty, which is implemented in C, are much lower than the runtimes of our proof-of-concept Matlab implementation, the runtime of the spectral assignment approach grows only slightly faster. Furthermore, the comparison shows that in particular the random graphs (21) and (22) seem to be comparatively easy to solve, while the union of strongly regular graphs (16) and the Steiner triple system graph (1) seem particularly hard for both nauty and the spectral approach. This is also reflected in the number of backtracking steps.
The spectral assignment approach for graphs with repeated eigenvalues could be optimized by a more sophisticated assignment strategy and by combining it with other heuristics. Instead of assigning vertices in the order of their numbering, as in Algorithm 2, it might be more efficient to exploit properties of the graph to decide which vertex should be assigned next. This is expected to reduce the number of backtracking steps and will be the focus of our future work.

Fig. 6. Comparison of the runtimes of the spectral assignment approach and nauty. Note that two different axes are used.

6. Conclusion

In this work, we have presented eigendecomposition-based methods for solving the GI problem. The algorithms were demonstrated with the aid of several guiding examples and benchmark problems. For friendly graphs, we have proven that the problem can be cast as a linear assignment problem. The approach was then generalized to unambiguous graphs. The examples show that the assignment problem formulation results in correct solutions even for ambiguous graphs. The primary issue related to the influence of ambiguous eigenvectors is the number of automorphisms and hence of feasible solutions of the LAP. For graphs with repeated eigenvalues, our approach relies on the repeated perturbation of the adjacency matrices and the solution of linear assignment problems. By exploiting properties of eigenpolytopes, it is possible to check whether two highly symmetric graphs are isomorphic. We believe that the proposed approach can be used to efficiently find isomorphisms, to detect and break symmetries and to gain insight into the structure of highly regular graphs. Other properties of the eigenpolytopes may be exploited to minimize the number of erroneous assignments which then require backtracking.
An important open question is the classification of graphs that require backtracking in the spectral approach and those that do not. The isomorphisms of graphs that do not require backtracking can consequently be computed in polynomial time. We conjecture that graphs that require backtracking have additional structure that makes the computations particularly challenging.

In practical applications, the graphs might be contaminated by noise [1]. Instead of finding a perfect matching with zero assignment cost, the goal then is to find a permutation which minimizes a given cost function. This is also called the inexact GI problem. Future work includes investigating whether our approach can also be used for the inexact problem formulation. Since the eigenvalues of a graph depend continuously on the entries of the adjacency matrix, a slightly perturbed graph will have a similar spectrum. Thus, instead of determining whether two graphs are isomorphic, the assignment approach can potentially be generalized so that the best matching of two graphs is computed, i.e. a permutation that minimizes the Frobenius norm distance between them. We believe that the Frobenius norm will serve as a good cost function for the inexact isomorphism problem. If the noise, however, is large, the spectrum of the graph might change in such a way that it becomes impossible to compare corresponding eigenvalues and eigenvectors.

Acknowledgements We would like to thank the reviewers for their helpful comments and suggestions.

Footnotes

1 The symmetry of the adjacency matrix should not be confused with the aforementioned graph symmetries.

2 The eigenvectors are, without loss of generality, assumed to be normalized.

3 E.g. by ensuring that for all eigenvectors 𝟙$$^{T} v_{A}^{(k)} > 0$$ and 𝟙$$^{T} v_{B}^{(k)} > 0$$.

4 In what follows, we will sometimes use the shorter cycle notation for permutations.
That is, a permutation is represented as a product of cycles, where cycles of length one are omitted. E.g. π = (1 2 3) means that 1 is assigned to 2, 2 to 3 and 3 to 1, while 4 remains unchanged.

5 This is due to the fact that 𝟙 is always an eigenvector of regular graphs; all the other eigenvectors must be perpendicular to it and are hence not friendly, cf. the study by [7].

6 Note that due to the cyclic symmetry, we could assign vertex v1 to any other vertex of $$\mathcal{G}_{B}$$.

7 Assume that $$V_{A}^{(k)}$$ and $$V_{B}^{(k)}$$ are simple eigenvectors and contain the same entries; then comparing $$E_{A}^{(k)}$$ and $$E_{B}^{(k)}$$ leads to the same non-zero pattern as comparing the eigenvectors entry-wise.

References

Aflalo, Y., Bronstein, A. & Kimmel, R. (2015) On convex relaxation of graph isomorphism. Proc. Natl. Acad. Sci., 112, 2942–2947.
Arvind, V. & Torán, J. (2005) Isomorphism testing: perspective and open problems. Bull. Eur. Assoc. Theor. Comput. Sci., 86, 66–84.
Babai, L. (2015) Graph isomorphism in quasipolynomial time. CoRR, abs/1512.03547.
Babai, L., Grigoryev, D. Y. & Mount, D. M. (1982) Isomorphism of graphs with bounded eigenvalue multiplicity. Proceedings of the 14th Annual ACM Symposium on Theory of Computing. New York, NY, USA: ACM, pp. 310–324.
Burkard, R. E. & Çela, E. (1999) Linear assignment problems and extensions. Handbook of Combinatorial Optimization (D.-Z. Du & P. M. Pardalos eds). Dordrecht: Kluwer, pp. 75–149.
Chan, A. & Godsil, C. D. (1997) Symmetry and eigenvectors. Graph Symmetry: Algebraic Methods and Applications (G. Hahn & G. Sabidussi eds). Dordrecht: Kluwer, pp. 75–106.
Fiori, M. & Sapiro, G. (2015) On spectral properties for graph matching and graph isomorphism problems. Inf. Inference, 4, 63–76.
Godsil, C. D. (1998) Eigenpolytopes of distance regular graphs. Canad. J. Math., 50, 739–755.
Gower, J. C. & Dijksterhuis, G. B. (2004) Procrustes Problems, no. 30 in Oxford Statistical Science Series. Oxford: Oxford University Press.
Hallgren, S., Russell, A. & Ta-Shma, A. (2003) The hidden subgroup problem and quantum computation using group representations. SIAM J. Comput., 32, 916–934.
Hopcroft, J. E. & Tarjan, R. E. (1972) Isomorphism of planar graphs. Complexity of Computer Computations (R. E. Miller & J. W. Thatcher eds). Plenum Press, pp. 131–152.
Junttila, T. & Kaski, P. (2007) Engineering an efficient canonical labeling tool for large and sparse graphs. Proceedings of the Ninth Workshop on Algorithm Engineering and Experiments and the Fourth Workshop on Analytic Algorithms and Combinatorics (D. Applegate, G. S. Brodal, D. Panario & R. Sedgewick eds). Philadelphia, PA, USA: SIAM, pp. 135–149.
Köbler, J. (2006) On graph isomorphism for restricted graph classes. Logical Approaches to Computational Barriers (A. Beckmann, U. Berger, B. Löwe & J. Tucker eds). Lecture Notes in Computer Science. Berlin: Springer, pp. 241–256.
Kuhn, H. W. (1955) The Hungarian method for the assignment problem. Nav. Res. Logistics Q., 2, 83–97.
Leighton, F. T. & Miller, G. L. (1979) Certificates for graphs with distinct eigenvalues. Original manuscript.
Leskovec, J. & Krevl, A. (2014) SNAP Datasets: Stanford large network dataset collection. http://snap.stanford.edu/data.
López-Presa, J. L., Fernández Anta, A. & Núñez Chiroque, L. (2011a) Conauto-2.0: fast isomorphism testing and automorphism group computation. ArXiv e-prints.
López-Presa, J. L., Fernández Anta, A. & Núñez Chiroque, L. (2011b) Graph isomorphism algorithm conauto. https://sites.google.com/site/giconauto.
Lovász, L. (2007) Eigenvalues of graphs. Discussion paper. Budapest, Hungary: Eötvös Loránd University.
Luks, E. M. (1982) Isomorphism of graphs of bounded valence can be tested in polynomial time. J. Comput. Syst. Sci., 25, 42–65.
McAuley, J. & Leskovec, J. (2012) Learning to discover social circles in ego networks. Adv. Neural Inf. Process. Syst., 25, 539–547.
McKay, B. D. (1981) Practical graph isomorphism. Congressus Numerantium, 30, 45–87.
McKay, B. D. & Piperno, A. (2014) Practical graph isomorphism, II. J. Symb. Comput., 60, 94–112.
Oren, I. & Band, R. (2012) Isospectral graphs with identical nodal counts. J. Phys. A Math. Theor., 45, 1–11.
Schönemann, P. (1968) On two-sided orthogonal Procrustes problems. Psychometrika, 33, 19–33.
Spielman, D. A. (1996) Faster isomorphism testing of strongly regular graphs. Proceedings of the 28th Annual ACM Symposium on Theory of Computing. New York, NY, USA: ACM, pp. 576–584.
Spielman, D. A. (2009) Spectral graph theory (lecture notes). http://www.cs.yale.edu/homes/spielman.
Valiayeu, V. V. (2011) Griso for regular graphs. http://sourceforge.net/projects/griso.

© The Author(s) 2018. Published by Oxford University Press on behalf of the Institute of Mathematics and its Applications. All rights reserved. Information and Inference: A Journal of the IMA, Oxford University Press.

Advance Article, published 9 February 2018, 18 pages. ISSN 2049-8764, eISSN 2049-8772. DOI: 10.1093/imaiai/iay001.
Recently, Babai showed that GI can be solved in quasi-polynomial time [3].
GI belongs to the larger family of isomorphism problems on algebraic structures such as groups or rings that seem to lie between P and NP-complete [2]. Another open question is whether GI can be solved efficiently using quantum computers. The GI problem can be regarded as a non-Abelian hidden subgroup problem (HSP) where the hidden subgroup is the automorphism group of the graph. An efficient solution of the HSP, which is the basis of many quantum algorithms, is only known for certain Abelian groups, whereas the general non-Abelian case remains open [10].

For several special classes of graphs, however, the GI problem is known to be solvable in polynomial time. These classes include, for instance, planar graphs [11] and graphs with bounded degree [20] or bounded eigenvalue multiplicity [4]. For an overview of isomorphism testing methods for these restricted graph classes, we refer to the study by [13].

Since the GI problem is challenging from a computational point of view, one is often forced to use different relaxations [7]. In the study by [1], the proposed GI testing approach for friendly graphs is based on convex relaxations where the set of permutation matrices is replaced by the set of doubly stochastic matrices. First, a quadratic problem is solved to find a doubly stochastic matrix that minimizes the cost function; the solution is then, in a second step, projected onto the set of permutation matrices by solving a linear assignment problem. These results are extended to a larger class of graphs in the study by [7].

In this paper, instead of a convex relaxation of the GI problem, we consider a different relaxation where the set of permutation matrices is replaced by the set of orthogonal matrices. We then construct a linear assignment problem based on the eigenvectors of the graphs. We show that for a certain class of graphs, the solution of the linear assignment problem is also the unique solution of the GI problem.
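Linear assignment problems of the kind used throughout the paper can be solved in polynomial time with the Hungarian method [14]; in Python, for instance, `scipy.optimize.linear_sum_assignment` does this (the cost matrix below is a toy example, not taken from the paper):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# cost[i, j] = cost of assigning vertex i of G_A to vertex j of G_B (toy values)
cost = np.array([[4.0, 1.0, 3.0],
                 [2.0, 0.0, 5.0],
                 [3.0, 2.0, 2.0]])

row, col = linear_sum_assignment(cost)     # minimizes tr(P^T C)
assert list(col) == [1, 0, 2]              # vertex 0 -> 1, 1 -> 0, 2 -> 2
assert np.isclose(cost[row, col].sum(), 5.0)
```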
For highly symmetric graphs, we propose an iterative algorithm which is based on spectral information and uses local perturbations of the adjacency matrices to break symmetries and to identify possible assignments. Our algorithm extends the applicability of existing spectral methods for the GI problem to strongly regular graphs with repeated eigenvalues.

The paper is organized as follows: Section 2 contains a brief description of different formulations of the GI problem. An overview of spectral properties of graphs is presented in Section 3. Furthermore, we show how these properties can be used to find isomorphisms between graphs with simple spectrum. In Section 4, we propose a novel eigendecomposition-based algorithm to determine whether two highly symmetric graphs are isomorphic. Numerical results for a number of different benchmark problems including strongly regular graphs and isospectral, but non-isomorphic graphs are presented in Section 5. Section 6 lists open questions and possible future work.

2. The GI problem

Given two weighted undirected graphs $$\mathcal{G}_{A} = (\mathcal{V}, \mathcal{E}_{A})$$ and $$\mathcal{G}_{B} = (\mathcal{V}, \mathcal{E}_{B})$$ with adjacency matrices A and B, where $$\mathcal{V} = \{ v_{1}, \dots , v_{n} \}$$ is the set of vertices and $$\mathcal{E}_{A}$$ and $$\mathcal{E}_{B}$$ are the sets of edges, we want to determine whether these graphs are isomorphic.

Definition 2.1 Two graphs $$\mathcal{G}_{A}$$ and $$\mathcal{G}_{B}$$ are said to be isomorphic—denoted by $$\mathcal{G}_{A} \cong \mathcal{G}_{B}$$—if one of the following equivalent conditions is satisfied: (i) There exists a permutation $$\pi \in \mathscr{S}_{n}$$ such that $$(v_{i}, v_{j}) \in \mathcal{E}_{A}{\; \Leftrightarrow \;} (v_{\pi(i)}, v_{\pi(j)}) \in \mathcal{E}_{B},$$ where $$\mathscr{S}_{n}$$ denotes the symmetric group of degree n.
(ii) There exists a permutation matrix $$P \in \mathscr{P}_{n}$$ such that $$B = P^{T} A P,$$ where $$\mathscr{P}_{n}$$ is the set of all n × n permutation matrices.

The relation between the permutation π and the permutation matrix P = (pij) is given by $$p_{ij} = \begin{cases}1, & \textrm{if}\ \pi(i) = j, \\ 0, & \textrm{otherwise}. \end{cases}$$ The GI problem can also be rewritten as a combinatorial optimization problem of the form \begin{align} \min_{P \in \mathscr{P}_{n}} \lVert B - P^{T} A P\rVert_{F}, \end{align} (2.1) where $$\left \lVert .\right \rVert _{F}$$ denotes the Frobenius norm. The graphs $$\mathcal{G}_{A}$$ and $$\mathcal{G}_{B}$$ are isomorphic if and only if the minimum of the above cost function is zero. Since $${\lVert B - P^{T} A P \rVert}_{{F}}^{{2}} = {\left\lVert B \right\rVert}_{{F}}^{{2}} - 2\textrm{tr}(B^{T} P^{T} A P) + {\left\lVert A \right\rVert}_{{F}}^{{2}},$$ cost function (2.1) is minimized if the term $$\textrm{tr}( B^{T} P^{T} A P )$$, which is the cost function of the NP-complete quadratic assignment problem, is maximized and vice versa.

Definition 2.2 An isomorphism from a graph $$\mathcal{G}$$ to itself is called an automorphism. The set of all automorphisms of a graph forms a group under matrix multiplication, the so-called automorphism group, typically denoted by $$\textrm{Aut}(\mathcal{G})$$. If the graphs $$\mathcal{G}_{A}$$ and $$\mathcal{G}_{B}$$ are isomorphic, then the number of isomorphisms from $$\mathcal{G}_{A}$$ to $$\mathcal{G}_{B}$$ is identical to the number of automorphisms of $$\mathcal{G}_{A}$$ or $$\mathcal{G}_{B}$$, respectively [27].

Definition 2.3 A graph $$\mathcal{G}$$ is called asymmetric if the automorphism group is trivial, i.e. $$\textrm{Aut}(\mathcal{G}) = \{I\}$$. If the automorphism group is non-trivial, we call the graph symmetric. The automorphism groups of strongly regular graphs, for example, can be highly non-trivial.
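Both the equivalence of Definition 2.1(ii) with the optimization formulation (2.1) and the expansion of the squared Frobenius norm can be checked numerically; in the sketch below (Python/NumPy), the random symmetric matrix and the permutations are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.integers(0, 2, (n, n)).astype(float)
A = np.triu(A, 1) + np.triu(A, 1).T   # a random symmetric 0/1 adjacency matrix
pi = rng.permutation(n)
P = np.eye(n)[pi]                     # p_ij = 1 iff pi(i) = j
B = P.T @ A @ P                       # an isomorphic copy of A

# The graphs are isomorphic, so the minimum of (2.1) is zero and attained at P:
assert np.isclose(np.linalg.norm(B - P.T @ A @ P), 0.0)

# Expanding the squared Frobenius norm reproduces the identity from the text:
Q = np.eye(n)[rng.permutation(n)]     # any other permutation matrix
lhs = np.linalg.norm(B - Q.T @ A @ Q)**2
rhs = (np.linalg.norm(B)**2
       - 2 * np.trace(B.T @ Q.T @ A @ Q)
       + np.linalg.norm(A)**2)
assert np.isclose(lhs, rhs)
```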
Definition 2.4 A graph is said to be regular (or weakly regular) if each vertex has the same number of neighbors, and strongly regular if additionally integers α and β exist such that every pair of vertices vi and vj shares exactly α common neighbors if vi and vj are adjacent and exactly β common neighbors otherwise [26].

The Frucht graph shown in Fig. 3a, for instance, is weakly regular. An example of a strongly regular graph is the Paley graph shown in Fig. 5a. The solution of the minimization problem (2.1) is not unique if the graphs possess non-trivial symmetries; for isomorphic graphs, the number of feasible solutions corresponds to the number of isomorphisms. Whether symmetries exist or not, however, is in general difficult to determine a priori. Although symmetries typically correspond to repeated eigenvalues, the correspondence is not exact [19]. Examples of asymmetric graphs with repeated eigenvalues or symmetric graphs with simple spectrum (see also Fig. 3c) can be found in the study by [7], for example. In what follows, we will characterize graphs using spectral properties.

3. Graphs with distinct eigenvalues

An isomorphism testing algorithm for graphs with n distinct eigenvalues, developed by Leighton and Miller, is presented in the studies by [15] and [27]. The method determines the isomorphism by separating vertices that are not equivalent into different classes. Based on the entries of the eigenvectors, these classes are refined until either an isomorphism is found or the graphs are shown to be non-isomorphic. Another approach that utilizes spectral information is presented in the study by [1], where so-called friendly graphs (defined below) are considered. The set of permutation matrices is replaced by the set of doubly stochastic matrices. After solving the resulting quadratic program, a linear assignment problem is solved to project the doubly stochastic matrix back onto the set of permutation matrices.
[1] prove that for friendly isomorphic graphs, the relaxed problem is equivalent to the original GI problem. We will use a different approach for GI testing that relies on a relaxation to the manifold of orthogonal matrices and—for friendly graphs—requires only the solution of a single linear assignment problem. An extension of this method to graphs with repeated eigenvalues will be proposed in Section 4.

3.1. The two-sided orthogonal Procrustes problem

Let $$\mathscr{O}_{n} = \{ P \in{\mathbb{R}}^{n \times n} \mid P^{T} P = I\}$$ denote the set of all orthogonal matrices. Note that the set of permutation matrices $$\mathscr{P}_{n}$$ is a subset of $$\mathscr{O}_{n}$$. Provided that the matrices A and B are symmetric1—we consider only undirected graphs—the solution of the relaxed problem \begin{align} \min_{P \in \mathscr{O}_{n}}\lVert B - P^{T} A P\rVert_{F}, \end{align} (3.1) which is called the two-sided orthogonal Procrustes problem [9, 25], can be computed analytically, provided that both A and B have distinct eigenvalues. This result is captured in the following theorem:

Theorem 3.1 Given two symmetric matrices A and B with distinct eigenvalues, let $$A = V_{A} \Lambda _{A} {V_{A}^{T}}$$ and $$B = V_{B} \Lambda _{B} {V_{B}^{T}}$$ be the eigendecompositions, with $$\Lambda _{A} =\textrm{diag}\big (\lambda _{A}^{(1)}, \dots , \lambda _{A}^{(n)}\big )$$, $$\Lambda _{B} =\textrm{diag}\big (\lambda _{B}^{(1)}, \dots , \lambda _{B}^{(n)}\big )$$ and $$\lambda _{A}^{(1)} < \dots < \lambda _{A}^{(n)}$$ as well as $$\lambda _{B}^{(1)} < \dots < \lambda _{B}^{(n)}$$. Then the orthogonal matrix P* which minimizes (3.1) is given by $$P^{\ast} = V_{A} S {V_{B}^{T}},$$ where $$S =\textrm{diag}(\pm 1, \dots , \pm 1)$$.

A proof of the above theorem can be found in the study by [25]. The eigenvalues and corresponding eigenvectors have to be sorted either both in increasing or both in decreasing order. Note that there are $$2^{n}$$ different orthogonal matrices which minimize the cost function.
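Theorem 3.1 can be verified numerically. The following sketch (Python/NumPy; random symmetric matrices stand in for adjacency matrices) checks that $$P^{\ast} = V_{A} S V_{B}^{T}$$ is orthogonal and attains the optimal value $$\lVert \Lambda_{B} - \Lambda_{A} \rVert_{F}$$ of (3.1):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n)); A = (A + A.T) / 2   # generically simple spectrum
B = rng.standard_normal((n, n)); B = (B + B.T) / 2

lam_A, V_A = np.linalg.eigh(A)     # eigh sorts eigenvalues in increasing order
lam_B, V_B = np.linalg.eigh(B)

S = np.diag([1.0, -1.0, 1.0, -1.0])  # one of the 2^n admissible sign matrices
P_star = V_A @ S @ V_B.T

# P* is orthogonal ...
assert np.allclose(P_star.T @ P_star, np.eye(n))
# ... and P*^T A P* = V_B Lambda_A V_B^T, so the residual is ||Lambda_B - Lambda_A||_F
assert np.isclose(np.linalg.norm(B - P_star.T @ A @ P_star),
                  np.linalg.norm(lam_B - lam_A))
```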
Lemma 3.2 If the graphs $$\mathcal{G}_{A}$$ and $$\mathcal{G}_{B}$$ are isomorphic, then one of the $$2^{n}$$ solutions is the permutation matrix P that minimizes (2.1).

Proof. Since all eigenvalues are distinct, the eigenvectors of B are—up to the signs2—permutations of the eigenvectors of A, i.e. $$V_{B} = P V_{A} \hat{S}$$, where $$P \in \mathscr{P}_{n}$$ permutes the rows and $$\hat{S} =\textrm{diag}(\pm 1, \dots , \pm 1)$$ flips the signs of the eigenvectors. If we now choose $$S = \hat{S}$$, then $$P^{\ast } = P V_{A} \hat{S}^{2} {V_{A}^{T}}$$. Using $$\hat{S}^{2} = I$$ and the orthogonality of VA, we obtain $$P^{\ast } = P \in \mathscr{P}_{n}$$.

Note that we are, however, not searching over the $$2^{n}$$ solutions. Let c be the cost of the optimal solution of the relaxed problem, i.e. $$c = \min_{P \in \mathscr{O}_{n}} \lVert B - P^{T} A P \rVert_{F} = \lVert B - P^{*T} A P^{\ast}\rVert_{F} = \left\lVert \Lambda_{B} - \Lambda_{A} \right\rVert_{F}\!.$$ As $$\mathscr{P}_{n} \subset \mathscr{O}_{n}$$, the graphs cannot be isomorphic if c ≠ 0. If, on the other hand, c = 0, this implies that the eigenvalues of A and B are identical and the graphs $$\mathcal{G}_{A}$$ and $$\mathcal{G}_{B}$$ are isospectral but not necessarily isomorphic. In addition to the eigenvalues, the eigenvectors of the graphs can be used for isomorphism testing, as shown in the following example.

Example 3.3 An example of isospectral graphs, taken from the study by [24], is shown in Fig. 1. Setting a = 1, b = 2 and c = 3, the eigenvectors of the graphs belonging to the largest eigenvalue $$\lambda _{A}^{(6)} = \lambda _{B}^{(6)} = 5.167$$ are \begin{align*} v_{A}^{(6)} =&\, [0.380, 0.092, 0.157, 0.655, 0.477, 0.407]^{T}, \\ v_{B}^{(6)} =&\, [0.222, 0.068, 0.352, 0.575, 0.353, 0.606]^{T}. \end{align*} Since the entries of the eigenvectors are different, $$v_{B}^{(6)}$$ cannot be written as a permutation of $$v_{A}^{(6)}$$, implying that $$\mathcal{G}_{A}$$ and $$\mathcal{G}_{B}$$ are not isomorphic.
△ Fig. 1. Isospectral graphs. Provided that the entries of the eigenvectors are distinct, this simple test can be used for GI testing. Here, we compared the entries of the eigenvectors corresponding to the largest eigenvalue since the largest eigenvalue of the adjacency matrix of a connected graph always has multiplicity one [19]. Furthermore, all entries of the corresponding eigenvector are strictly positive. Note that in general the normalized eigenvectors are only determined up to the sign. Thus, two comparisons might be required. If the absolute values of all entries of an eigenvector are different—[27] calls such an eigenvector helpful—this gives us a canonical labeling of the vertices and the GI problem can be solved easily. 3.2. Friendly and unambiguous graphs In the study by [1], friendly graphs are considered. In contrast to asymmetry, friendliness can be easily verified. Definition 3.4 Let 𝟙 $$\in{\mathbb{R}}^{n}$$ denote the vector of ones. A graph $$\mathcal{G}_{A}$$ with adjacency matrix A is called friendly if A has distinct eigenvalues and all eigenvectors $$v_{A}^{(k)}$$ satisfy 𝟙$$^{T} v_{A}^{(k)} \ne 0$$. As a result, it is possible to make the signs of corresponding eigenvectors consistent.3 This corresponds to finding the sign matrix S in Lemma 3.2. Thus, the permutation matrix which solves the GI problem can be computed directly. We will propose a different approach that relies on the solution of a linear assignment problem and will be generalized later on. For friendly graphs, we obtain: Lemma 3.5 Every friendly graph GA is asymmetric [1]. Proof. Let $$A = V_{A} \Lambda _{A} {V_{A}^{T}}$$ be the eigendecomposition. Assuming there exists $$P \in \mathscr{P}_{n}$$ with A = PTAP, we obtain another eigendecomposition A = (PTVA)ΛA(PTVA)T. Since the eigenvectors are determined up to the signs, VA = PTVAS, where $$S =\textrm{diag}(\pm 1, \dots , \pm 1)$$.
Thus, 𝟙TVA = 𝟙TPTVAS = 𝟙TVAS. Each entry of the vector 𝟙TVA must be non-zero since the graph is friendly. Thus, the equation can only be satisfied if S = I. However, S = I implies VA = PTVA and hence P = I. As a result, the automorphism group only contains the identity matrix and the graph is asymmetric. The converse is not true: the Frucht graph shown in Fig. 3a, for instance, is asymmetric but not friendly. Furthermore, there are asymmetric graphs with repeated eigenvalues. A Venn diagram of different graph classes is shown in Fig. 2 (reproduced from the study by [7]; examples of graphs in each of these categories can also be found there). Even if eigenvectors are not friendly, it is often possible to make the signs of two corresponding eigenvectors consistent. Fig. 2. Different classes of graphs based on the study by [7]. Friendly graphs are given by the intersection of graphs with simple spectrum and graphs whose eigenvectors are non-orthogonal to 𝟙. Definition 3.6 We call an eigenvector v ambiguous if v and −v have exactly the same entries. That is, there exists $$P \in \mathscr{P}_{n}$$ with v = −Pv. A graph without ambiguous eigenvectors is called unambiguous. This property can be easily verified by sorting the entries. Note that ambiguity implies unfriendliness since v = −Pv results in 𝟙$$^{T} v = -$$𝟙$$^{T} v$$ and thus 𝟙$$^{T} v = 0$$. The class of graphs whose eigenvectors are not ambiguous is thus larger than the class of friendly graphs. 3.3. A spectral assignment approach for friendly and unambiguous graphs We now want to construct a linear assignment problem for friendly graphs that solves the GI problem.
In our approach, the cost of assigning vertices of $$\mathcal{G}_{A}$$ to vertices of $$\mathcal{G}_{B}$$ will be based on the eigenvectors. Definition 3.7 Let $$V, W \in{\mathbb{R}}^{n \times m}$$ be two matrices, then the cost of assigning V to W is defined to be $$C(V, W) = (c_{ij}) \in{\mathbb{R}}^{n \times n}$$, with $$c_{ij} = \sum_{k=1}^{m} \left\lvert v_{i}^{(k)} - w_{j}^{(k)} \right\rvert,$$ where v(k) and w(k) are the column vectors of V and W, respectively. Lemma 3.8 Let V, W and C = C(V, W) be as in the above definition. Define \begin{align} c = \min_{P \in \mathscr{P}_{n}} \textrm{tr} ( P^{T} C ) \end{align} (3.2) to be the minimal cost of the linear assignment problem. Then V = PW for $$P \in \mathscr{P}_{n}$$ if and only if c = 0. Proof. We will first show that this result holds for vectors v and w. Assume that v = Pw for $$P \in \mathscr{P}_{n}$$. It follows that vi = wπ(i) and thus $$c_{i, \pi (i)} = \left \lvert v_{i} - w_{\pi (i)} \right \rvert = 0$$. Furthermore, $$\textrm{tr} (P^{T} C) = \sum_{i,j=1}^{n} p_{ij} c_{ij} = \sum_{i=1}^{n} c_{i, \pi(i)} = 0.$$ For the other direction, assume that c = 0 and that the corresponding permutation matrix is $$\hat{P}$$. Then $$c_{i, \hat{\pi }(i)} = \lvert v_{i} - w_{\hat{\pi }(i)}\rvert = 0$$ and consequently $$v = \hat{P} w$$. For matrices, the proof is almost identical. Assume that V = PW. Then v(k) = Pw(k) for all column vectors. Thus, with the first part, we obtain ci, π(i) = 0 and $$\textrm{tr} \left ( P^{T} C \right ) = 0$$. The other direction follows in the same way. The linear assignment problem can be solved in $$O(n^{3})$$ time using the Hungarian method [5, 14]. For two friendly graphs $$\mathcal{G}_{A}$$ and $$\mathcal{G}_{B}$$ with adjacency matrices A and B and eigendecompositions as described in Theorem 3.1, we define $$C(\mathcal{G}_{A}, \mathcal{G}_{B})=C(V_{A}, V_{B})$$. Note that in the following theorem we assume that the signs of the corresponding eigenvectors $$v_{A}^{(k)}$$ and $$v_{B}^{(k)}$$ are consistent.
This is possible as the sum of the entries of each eigenvector is non-zero. Theorem 3.9 Let $$\mathcal{G}_{A}$$ and $$\mathcal{G}_{B}$$ be friendly graphs. Define $$C=C(\mathcal{G}_{A}, \mathcal{G}_{B})$$ and c as in (3.2). Then $$\mathcal{G}_{A}\cong \mathcal{G}_{B}$$ if and only if ΛA =ΛB and c = 0. The solution P of the linear assignment problem is then a solution of GI. Proof. Assume that $$\mathcal{G}_{A}\cong \mathcal{G}_{B}$$ and thus $$A = V_{A} \Lambda _{A} {V_{A}^{T}} = P B P^{T} = (P V_{B}) \Lambda _{B} (P V_{B})^{T}$$. Using Lemma 3.8, we obtain $$\textrm{tr} \left ( P^{T} C \right ) = 0$$. If, on the other hand, c = 0, Lemma 3.8 implies that VA = PVB and thus $$\lVert B - P^{T} A P \rVert_{F} = \left\lVert V_{B} \Lambda_{B} {V_{B}^{T}} - P^{T} V_{A} \Lambda_{A} {V_{A}^{T}} P\right\rVert_{F} = \left\lVert \Lambda_{B} - \Lambda_{A}\right\rVert_{F} = 0.$$ For friendly graphs, we propose the GI testing approach described in Algorithm 1. If an eigenvector is ambiguous, we will use the absolute values of the entries for the computation of the cost matrix. Algorithm 1 Graph isomorphism testing for friendly graphs. Example 3.10 Let us illustrate the definition of ambiguity:4 (i) The vectors v = [1, 2, 0, −3]T and w = [0, −1, −2, 3]T are unfriendly, but not ambiguous, and v can only be assigned to − w using π = (1 2 3). (ii) The vectors v = [1, 2, −1, −2]T and w = [−2, −1, 1, 2]T, on the other hand, are ambiguous since v can be assigned to w using π = (1 3 2 4) or to − w using π = (1 2). Taking absolute values leads to two spurious solutions, given by π = (1 3 2) and π = (1 2 4). △ If the eigenvectors are not ambiguous, we can make the signs consistent (e.g. by sorting the entries) and apply Algorithm 1 in the same way by replacing only step 2. Note that we only assumed that the signs of the eigenvectors are consistent in Theorem 3.9, the friendliness property was not used explicitly. 
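The procedure can be sketched end to end (Python with NumPy and SciPy; a simplified proof-of-concept rather than the full implementation, with the signs fixed via 𝟙$$^{T} v > 0$$, the cost matrix of Definition 3.7 and the Hungarian-method solver scipy.optimize.linear_sum_assignment):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def friendly_gi_test(A, B, tol=1e-6):
    """Sketch of Algorithm 1: GI testing for friendly graphs.
    Returns (isomorphic, P) with A = P B P^T when isomorphic."""
    lam_A, V_A = np.linalg.eigh(A)
    lam_B, V_B = np.linalg.eigh(B)
    if np.linalg.norm(lam_A - lam_B) > tol:       # spectra must coincide
        return False, None
    # friendliness: 1^T v != 0, so the signs can be fixed by 1^T v > 0
    V_A = V_A * np.sign(V_A.sum(axis=0))
    V_B = V_B * np.sign(V_B.sum(axis=0))
    # cost matrix of Definition 3.7: c_ij = sum_k |v_i^(k) - w_j^(k)|
    C = np.abs(V_A[:, None, :] - V_B[None, :, :]).sum(axis=2)
    rows, cols = linear_sum_assignment(C)         # Hungarian method, O(n^3)
    if C[rows, cols].sum() > tol:
        return False, None
    P = np.zeros_like(A)
    P[rows, cols] = 1.0                           # V_A = P V_B
    return bool(np.allclose(A, P @ B @ P.T, atol=tol)), P
```

By Lemma 3.8 and Theorem 3.9, the assignment cost vanishes exactly when the sign-aligned eigenvector matrices are row permutations of each other, in which case the recovered P solves the GI problem.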
Example 3.11 Let us consider different graph types to illustrate the idea behind the assignment approach: (i) Given the Frucht graph shown in Fig. 3a and the permutation of the graph shown in Fig. 3b, the resulting cost matrix C is displayed in Fig. 3g. The solution of the linear assignment problem is $$\pi = \left[\begin{array}{cccccccccccc} 1 & 2 & 3 & 4 & 5 & 6 & 7 & 8 & 9 & 10 & 11 & 12 \\ 4 & 5 & 6 & 1 & 2 & 3 & 9 & 8 & 7 & 12 & 11 & 10 \end{array}\right]$$ and the cost of the assignment is zero. The graph has simple spectrum, is asymmetric, regular and thus not friendly.5 However, only one eigenvector is ambiguous. Even without taking into account this eigenvector, the algorithm successfully computes the correct permutation matrix. (ii) An example of a graph with non-trivial automorphism group but simple spectrum, taken from the study by [7], and a random permutation are shown in Fig. 3c–d, the corresponding cost matrix C is depicted in Fig. 3h. Here, two eigenvectors are ambiguous and the solution of the LAP is not unique since v1 could be assigned to v2 or v4 and v2 to v1 or v3. These assignments, however, are not independent: as soon as one of the first four vertices is assigned, the others follow automatically. Feasible solutions are $$\pi_{1} = \left[\begin{array}{cccccccc} 1 & 2 & 3 & 4 & 5 & 6 & 7 & 8 \\ 4 & 3 & 2 & 1 & 6 & 5 & 7 & 8 \end{array}\right], \quad \pi_{2} = \left[\begin{array}{cccccccc} 1 & 2 & 3 & 4 & 5 & 6 & 7 & 8 \\ 2 & 1 & 4 & 3 & 6 & 5 & 7 & 8 \end{array}\right].$$ If we do not use the absolute values of the ambiguous eigenvectors for the computation of the cost matrix, then there are four possible combinations: we can assign $$v_{A}^{(3)}$$ to $$\pm v_{B}^{(3)}$$ and $$v_{A}^{(6)}$$ to $$\pm v_{B}^{(6)}$$. One combination will result in π1, one in π2, the remaining two lead to non-zero assignment costs. Thus, taking absolute values prevents conflicting information about possible assignments, but also results in spurious solutions (cf.
Example 3.10). (iii) The Facebook social circles graph [21], available through the SNAP website [16], consists of 4039 vertices and 88234 edges. The adjacency matrices of the original and permuted graph are shown in Fig. 3e–f. The cluster structure of the graph is clearly visible in Fig. 3e. Numerically, the graph has a couple of repeated eigenvalues (around λ = −1, λ = 0 and λ = 1) and we neglect the corresponding eigenvectors. Nevertheless, Algorithm 1 returns a valid assignment that solves the GI problem. △ Fig. 3. (a–b) Original and permuted Frucht graph. (c–d) Original and permuted house graph. (e–f) Original and permuted adjacency matrix of the Facebook graph. (g–h) Cost matrix for Frucht graph and house graph. White entries represent possible assignments with cost cij < ε. The examples show that even with incomplete information it is possible to compute valid solutions of the GI problem. If the solution is not unique, constructing the cost matrix reduces the search space significantly since only zero-cost entries need to be taken into account. Furthermore, the examples illustrate the difficulties arising from ambiguous eigenvectors. 4. Graphs with repeated eigenvalues If the graphs possess repeated eigenvalues, finding isomorphisms is much harder. The eigenvectors belonging to repeated eigenvalues are unique only up to basis rotations, and we cannot construct a linear assignment problem by comparing corresponding eigenvectors anymore. As mentioned above, repeated eigenvalues typically correspond to graph symmetries. It can be shown that strongly regular graphs, for instance, possess at most three distinct eigenvalues [19].
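The basis ambiguity is easy to observe numerically. In the following small illustration (Python/NumPy; the 4-cycle is our choice of example), rotating a basis of a repeated eigenspace yields an equally valid set of eigenvectors that is not a permutation of the original one, while the matrix $$V V^{T}$$ stays unchanged; this invariance is exploited in Section 4.1:

```python
import numpy as np

# Adjacency matrix of the 4-cycle; its eigenvalue 0 has multiplicity two.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], float)
# Orthonormal basis of the 0-eigenspace of A.
V = np.array([[1, 0], [0, 1], [-1, 0], [0, -1]]) / np.sqrt(2)
# Rotate the basis: W spans the same eigenspace, so its columns are
# equally valid eigenvectors, but not permutations of those of V.
t = 0.3
R = np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])
W = V @ R
assert np.allclose(A @ W, 0)              # still eigenvectors for lambda = 0
assert np.allclose(V @ V.T, W @ W.T)      # the projection V V^T is invariant
```

Entrywise comparison of eigenvectors therefore breaks down for repeated eigenvalues, whereas the projection onto the eigenspace remains a well-defined invariant.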
Definition 4.1 Let $$\mathcal{G}_{A}$$ be a graph with adjacency matrix A and let $$\lambda _{A} = \big [ \lambda _{A}^{(1)}, \dots , \lambda _{A}^{(m)} \big ]$$ be the eigenvalues of the graph $$\mathcal{G}_{A}$$ with multiplicities $$\mu _{A} = \big [ \mu _{A}^{(1)}, \dots , \mu _{A}^{(m)} \big ]$$, i.e. $$\sum _{k=1}^{m} \mu _{A}^{(k)} = n$$. We then partition VA into $$V_{A} = \big [ V_{A}^{(1)}, \dots , V_{A}^{(m)} \big ]$$, with $$V_{A}^{(k)} \in{\mathbb{R}}^{n \times \mu _{A}^{(k)}}$$. That is, $$V_{A}^{(k)}$$ is either the eigenvector belonging to the eigenvalue $$\lambda _{A}^{(k)}$$ or the matrix whose columns form an orthogonal basis of the eigenspace. Example 4.2 We will use the following guiding examples to illustrate the proposed isomorphism testing approach for graphs with repeated eigenvalues: (i) The eigenvalues of the cycle graph shown in Fig. 4a are λA = [−2, −1, 1, 2] with multiplicities μA = [1, 2, 2, 1]. The automorphism group of the cycle graph is D6 and thus $$\left \lvert \textrm{Aut}(\mathcal{G})\right \rvert = 12$$. (ii) The eigenvalues of the strongly regular Paley graph shown in Fig. 5a are $$\lambda _{A} = \big [ \frac{-1 - \sqrt{17}}{2}, \frac{-1 + \sqrt{17}}{2}, 8 \big ]$$ with multiplicities $$\mu _{A} = \big [ 8, 8, 1 \big ]$$. Here, Aut($$\mathcal{G}$$) ≅ S5. Thus, the graph possesses $$\left \lvert \textrm{Aut}(\mathcal{G})\right \rvert = 5! = 120$$ automorphisms. △ Fig. 4. GI testing procedure for the cycle graph. The various colors represent self-loops with different weights. The bottom row shows the structure of the corresponding cost matrices C. White entries represent possible assignments with cost cij < ε. After two perturbations of the graphs, the solution of the LAP is unique.
Fig. 5. (a) Strongly regular Paley graph. (b–e) Structure of the cost matrices C after 1, 2, 3 and 4 successful assignments. White entries represent assignments with cij < ε. After four perturbations, the solution is unique. 4.1. Eigenpolytopes For the graphs in the previous example, it is not possible to apply Algorithm 1 directly. If we compare only eigenvectors belonging to distinct eigenvalues, this leads to C = 0. That is, any permutation matrix would be a feasible solution of the linear assignment problem. Therefore, we need to exploit additional information encoded in matrices representing orthogonal projections onto the eigenspace of repeated eigenvalues. Definition 4.3 Let $$\lambda _{A}^{(k)}$$ be a repeated eigenvalue of graph $$\mathcal{G}_{A}$$. For a vertex vi, define $$V_{A}^{(k)}(\mathcal{v}_{i})$$ to be the ith row of $$V_{A}^{(k)}$$. The convex hull of all vectors $$V_{A}^{(k)}(\mathcal{v}_{i})$$, $$i = 1, \dots , n$$, is called the eigenpolytope of the graph belonging to the eigenvalue $$\lambda _{A}^{(k)}$$. The row vectors $$V_{A}^{(k)}(\mathcal{v}_{i})$$ clearly depend on the orthogonal basis chosen for the eigenspace, but the scalar product is independent of the choice of basis [8]. The matrix $$E_{A}^{(k)} = V_{A}^{(k)} \left(V_{A}^{(k)}\right)^{T},$$ i.e.
$$(E_{A}^{(k)})_{ij} = \big \langle V_{A}^{(k)}(\mathcal{v}_{i}), \, V_{A}^{(k)}(\mathcal{v}_{j}) \big \rangle$$, represents the orthogonal projection onto the column space of $$V_{A}^{(k)}$$ and is an invariant of the eigenspace that does not depend on the orthogonal basis chosen for $$V_{A}^{(k)}$$, see also the study by [6]. Thus, $$E_{B}^{(k)} = P^{T} E_{A}^{(k)} P$$, which in itself can again be interpreted as a GI problem. For a detailed description of the relation between a graph and the geometry of its eigenpolytopes, we refer to the studies by [6] and [8]. We now exploit properties of the matrices $$E_{A}^{(k)}$$ to identify isomorphisms. In what follows, we will show that by comparing eigenvectors and eigenpolytopes, it is possible to compute isomorphisms of strongly regular graphs such as the Paley graph. 4.2. A spectral assignment approach for graphs with symmetries As described above, repeated eigenvalues complicate GI testing. Our heuristic approach is based on finding local perturbations of the adjacency matrices A and B that break symmetries without changing the isomorphism. Let us illustrate the basic idea with a simple example. Example 4.4 Let us consider the cycle graphs shown in Fig. 4a. In order to find an assignment for vertices of $$\mathcal{G}_{A}$$ to vertices of $$\mathcal{G}_{B}$$, we perturb the adjacency matrices A and B. If we add a self-loop to vertex v1 of $$\mathcal{G}_{A}$$ and v1 of $$\mathcal{G}_{B}$$, the two graphs remain isomorphic.6 Thus, we assign vertex v1 to vertex v1 of $$\mathcal{G}_{B}$$. The updated graphs are shown in Fig. 4b, the marked vertices denote self-loops with weight 1. Now, we try to assign vertex v2 of $$\mathcal{G}_{A}$$ to a vertex of $$\mathcal{G}_{B}$$. Since we have broken the cyclic symmetry, there are now only two possible assignments (due to the remaining reflection symmetry). Vertex v2 of $$\mathcal{G}_{A}$$ could be either assigned to vertex v5 or v6 of $$\mathcal{G}_{B}$$. 
After adding self-loops with weight 2 to vertex v2 of $$\mathcal{G}_{A}$$ and vertex v5 of $$\mathcal{G}_{B}$$—shown in Fig. 4c—the resulting graphs are friendly and thus asymmetric. The permutation matrix P could be computed using Algorithm 1. Alternatively, the procedure described above can be repeated until a valid assignment for all vertices is found. The resulting graphs are shown in Fig. 4d. △ Let us formalize the above procedure. We start with the original adjacency matrices A and B and construct cost matrices C(k), $$k = 1, \dots , m$$, as follows. For simple eigenvectors, we use the cost matrix from Definition 3.7, i.e. $$C^{(k)} = C(V_{A}^{(k)}, V_{B}^{(k)})$$. For repeated eigenvalues, we compute the projection matrices $$E_{A}^{(k)}$$ and $$E_{B}^{(k)}\!,$$ and check for each row i of $$E_{A}^{(k)}$$ whether it can be written as a permutation of row j of $$E_{B}^{(k)}$$ by comparing the sorted vectors.7 Definition 4.5 Let $$E_{A}^{(k)}(\mathcal{v}_{i})$$ and $$E_{B}^{(k)}(\mathcal{v}_{j})$$ be the ith and jth row of the matrices $$E_{A}^{(k)}$$ and $$E_{B}^{(k)}$$, respectively, and let $$s :{\mathbb{R}}^{n} \to{\mathbb{R}}^{n}$$ be a function that sorts the entries of a vector. For repeated eigenvalues, we define $$C^{(k)} = (c_{ij}^{(k)})$$, with $$c_{ij}^{(k)} = \big \lVert s \big ( E_{A}^{(k)}(\mathcal{v}_{i}) \big ) - s \big ( E_{B}^{(k)}(\mathcal{v}_{j}) \big ) \big \rVert _{F}$$. Note that this is only a heuristic approach and might lead to wrong assignments. However, utilizing properties of the eigenpolytopes improves the efficiency of the algorithm significantly and backtracking is required only in exceptional cases. For two graphs $$\mathcal{G}_{A}$$ and $$\mathcal{G}_{B}$$, the cost matrix is then defined as $$C(\mathcal{G}_{A}, \mathcal{G}_{B}) = \sum_{k=1}^{m} C^{(k)}.$$ The entries cij represent the costs of assigning vi of $$\mathcal{G}_{A}$$ to vj of $$\mathcal{G}_{B}$$.
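The whole procedure (group the eigenvalues, build the cost matrix from sorted projection rows, pin one vertex pair per iteration) can be sketched as follows. This is a simplified Python/NumPy/SciPy proof-of-concept without backtracking; following footnote 7, it uses the sorted-row costs of Definition 4.5 for simple eigenvalues as well, which also sidesteps the sign ambiguity of individual eigenvectors. The function names are ours.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def eigenprojections(M, tol=1e-6):
    """Orthogonal projections E^(k) = V^(k) (V^(k))^T onto the eigenspaces
    of symmetric M; eigenvalue gaps below tol are treated as repeated."""
    lam, V = np.linalg.eigh(M)
    out, k = [], 0
    while k < len(lam):
        j = k
        while j + 1 < len(lam) and lam[j + 1] - lam[j] < tol:
            j += 1                       # extend the current eigenvalue cluster
        out.append(V[:, k:j + 1] @ V[:, k:j + 1].T)
        k = j + 1
    return out

def cost_matrix(A, B, tol=1e-6):
    """C = sum_k C^(k) with the sorted-row costs of Definition 4.5;
    assumes the (perturbed) graphs are isospectral."""
    C = np.zeros(A.shape)
    for Ea, Eb in zip(eigenprojections(A, tol), eigenprojections(B, tol)):
        Sa, Sb = np.sort(Ea, axis=1), np.sort(Eb, axis=1)
        C += np.linalg.norm(Sa[:, None, :] - Sb[None, :, :], axis=2)
    return C

def spectral_assignment(A, B, tol=1e-6):
    """Iteratively add self-loops D_i(w) with distinct weights to break
    symmetries; returns the assignment i -> cols[i] or None."""
    A, B = A.astype(float).copy(), B.astype(float).copy()
    n, w = A.shape[0], np.abs(A).max() + 1.0
    cols = None
    for i in range(n):
        C = cost_matrix(A, B, tol)
        rows, cols = linear_sum_assignment(C)
        if C[rows, cols].sum() > tol:
            return None                  # no admissible assignment found
        if (C < tol).sum() == n:         # zero pattern is already unique
            return cols
        A[i, i] += w                     # pin vertex i of G_A ...
        B[cols[i], cols[i]] += w         # ... to vertex cols[i] of G_B
        w += 1.0
    return cols
```

For the cycle graph of Example 4.4 no backtracking is needed and the returned assignment satisfies A[i, k] = B[cols[i], cols[k]] for all i, k.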
To determine possible assignments, we again compute $$C=C(\mathcal{G}_{A}, \mathcal{G}_{B})$$ and solve the resulting linear assignment problem $$c = \min_{P \in \mathscr{P}_{n}} \textrm{tr} (P^{T} C).$$ For the unperturbed cycle graph and Paley graph, the resulting cost matrices are zero, which implies that any vertex of $$\mathcal{G}_{A}$$ can initially be assigned to any vertex of $$\mathcal{G}_{B}$$. However, these assignments cannot be chosen independently. Thus, we assign vertices iteratively using local perturbations of the graphs as described in Example 4.4. After perturbing the graphs, symmetries are destroyed and the number of zero-cost entries decreases until only one feasible solution remains. In order to perturb the adjacency matrices A and B, and hence the eigenvalues and eigenvectors, we use single-entry matrices representing self-loops with different weights w. Definition 4.6 Define $$D_{i}(w) =\textrm{diag}(d_{1}, \dots , d_{n})$$ to be the diagonal matrix with $$d_{j} = \begin{cases} w, & \textrm{if } j = i, \\ 0, & \textrm{otherwise}. \end{cases}$$ The proposed method for graphs with repeated eigenvalues is described in Algorithm 2. The algorithm can be stopped if the solution of the LAP is unique. The number of iterations required to obtain a unique solution depends on the order in which the vertices are perturbed. In the description of the algorithm, we have not included backtracking techniques. Backtracking is needed if a previously found assignment does not result in a correct permutation. We then delete the previous assignment and try to find a different assignment for the current vertex. Backtracking is required only for certain graph types as illustrated in Section 5. Algorithm 2 GI testing for graphs with repeated eigenvalues Example 4.7 Let us consider again the graphs from Example 4.2: (i) For the cycle graph, the cost matrices C that result in successful assignments are shown in Fig. 4a–d.
Without perturbing the adjacency matrices, each vertex of GA can be assigned to each vertex of GB and the cost matrix C is zero. After one perturbation, all eigenvalues are distinct, but due to the remaining reflection symmetry the solution is not unique and there are still two ambiguous eigenvectors (see Example 3.11). After two iterations, the solution is unique and the resulting permutation is given by $$\pi = \left[\begin{array}{cccccc} 1 & 2 & 3 & 4 & 5 & 6 \\ 1 & 5 & 3 & 4 & 2 & 6 \end{array}\right].$$ (ii) For the Paley graph, the cost matrices after 1, 2, 3 and 4 successful assignments are shown in Fig. 5b–e. After the fourth perturbation, the solution is unique and $$\pi = \left[\begin{array}{ccccccccccccccccc} 1 & 2 & 3 & 4 & 5 & 6 & 7 & 8 & 9 & 10 & 11 & 12 & 13 & 14 & 15 & 16 & 17\\ 1 & 6 & 15 & 3 & 11 & 7 & 17 & 12 & 9 & 8 & 4 & 2 & 5 & 16 & 10 & 13 & 14\end{array}\right].$$ △ These examples demonstrate that the proposed method successfully computes isomorphisms of graphs with repeated eigenvalues. 5. Benchmark problems In this section, we will present numerical results for benchmark problems downloaded from the studies by [28] and [18]. For our computations, we used Matlab and an error tolerance ε = 10−6. That is, two eigenvalues of a matrix are defined to be identical if the difference is less than ε. Furthermore, an assignment is accepted if the cost c of the solution of the LAP is less than ε. For all graphs downloaded from the study by [28], the algorithm returned correct results without backtracking. Results for larger benchmark problems used by conauto are presented in Table 1. For each benchmark problem, we run the proposed algorithm 100 times using different randomly generated permutations of the original graph. Here, n is the number of vertices, noBT the number of runs where no backtracking was required and BT the number of runs where backtracking was required to find an isomorphism.
The column steps (avg) lists the average number of backtracking steps needed to find an isomorphism and the last column lists the average runtime in seconds. The efficiency of the algorithm could easily be improved by using C or C++. The results show that the algorithm returns correct results for most of the benchmark problems without backtracking. Only the Steiner triple system graph (1) and the union of strongly regular graphs (16) require backtracking for almost all test cases.

Table 1 Test results for various benchmark graphs

Type                                      #   Name               n   noBT  BT  steps (avg)  time [s]
Steiner triple system graphs              1   sts-19_57          57     5  95     37.91      11.051
Latin square graphs (prime order)         2   latin-3_9           9   100   0      0.00       0.004
                                          3   latin-5_25         25   100   0      0.00       0.017
                                          4   latin-7_49         49    98   2      1.00       0.141
Latin square graphs (prime power order)   5   latin-2_4           4   100   0      0.00       0.001
                                          6   latin-4_16         16   100   0      0.00       0.008
                                          7   latin-6_36         36    74  26      2.31       0.086
Paley graphs (prime order)                8   paley-prime_13     13    89  11      1.09       0.006
                                          9   paley-prime_29     29   100   0      0.00       0.031
Paley graphs (prime power order)         10   paley-power_9       9   100   0      0.00       0.002
                                         11   paley-power_25     25   100   0      0.00       0.020
Lattice graphs                           12   lattice(4)_16      16   100   0      0.00       0.010
                                         13   lattice(6)_36      36   100   0      0.00       0.067
Triangular graphs                        14   triangular(7)_21   21   100   0      0.00       0.013
                                         15   triangular(10)_45  45   100   0      0.00       0.097
Unions of strongly regular graphs        16   usr(1)_29-1        29    11  89     13.49       0.392
Clique-connected cubic                   17   chh_cc(1-1)_22-1   22   100   0      0.00       0.006
  hypo-Hamiltonian graphs                18   chh_cc(2-1)_44-1   44   100   0      0.00       0.097
Non-disjoint unions of undirected        19   tnn(1)_26-1        26   100   0      0.00       0.066
  tripartite graphs                      20   tnn(2)_52-1        52   100   0      0.00       0.700
Random graphs                            21   iso_r01N_s20       20   100   0      0.00       0.002
                                         22   iso_r01N_s40       40   100   0      0.00       0.010

In order to analyze the scalability of the spectral assignment approach, we compare it with the state-of-the-art graph automorphism and isomorphism tool nauty [22, 24]. For each benchmark graph, we run nauty 100 times using different randomly generated permutations. Additionally, each GI instance is solved 10000 times to obtain more accurate runtimes. The results are shown in Fig. 6. We expect similar results for other tools such as conauto [17] or bliss [12] (for a comparison of these algorithms, see the study by [23]).
While the absolute runtimes of nauty, which is implemented in C, are much lower than the runtimes of our proof-of-concept Matlab implementation, the complexity of the spectral assignment approach grows only slightly faster. Furthermore, the comparison shows that the random graphs (21) and (22) in particular seem to be comparably easy to solve, while the union of strongly regular graphs (16) and the Steiner triple system graph (1) seem particularly hard for both nauty and the spectral approach. This is also reflected in the number of backtracking steps. The spectral assignment approach for graphs with repeated eigenvalues could be optimized by a more sophisticated assignment strategy and by combining it with other heuristics. Instead of assigning nodes depending on the node numbers as described in Algorithm 2, it might be more efficient to exploit properties of the graph to decide which node should be assigned next. This is expected to reduce the number of backtracking steps and will be the focus of our future work. Fig. 6. Comparison of the runtimes of the spectral assignment approach and nauty. Note that two different axes are used. 6. Conclusion In this work, we have presented eigendecomposition-based methods for solving the GI problem. The algorithms were demonstrated with the aid of several guiding examples and benchmark problems. For friendly graphs, we have proven that the problem can be cast as a linear assignment problem. The approach was then generalized to unambiguous graphs. The examples show that the assignment problem formulation results in correct solutions even for ambiguous graphs. The primary issue related to the influence of ambiguous eigenvectors is the number of automorphisms and feasible solutions of the LAP.
For graphs with repeated eigenvalues, our approach relies on the repeated perturbation of the adjacency matrices and solution of linear assignment problems. By exploiting properties of eigenpolytopes, it is possible to check whether two highly symmetric graphs are isomorphic. We believe that the proposed approach can be used to efficiently find isomorphisms, to detect and break symmetries and to gain insight into the structure of highly regular graphs. Other properties of the eigenpolytopes may be exploited to minimize the number of erroneous assignments which then require backtracking. An important open question is the classification of graphs that require backtracking in the spectral approach and those that do not. The isomorphisms for graphs that do not require backtracking can consequently be computed in polynomial time. We conjecture that graphs that require backtracking have additional structure that makes the computations particularly challenging. In practical applications, the graphs might be contaminated by noise [1]. Instead of finding a perfect matching with zero assignment cost, the goal then is to find a permutation which minimizes a given cost function. This is also called the inexact GI problem. Future work includes investigating whether our approach can also be used for the inexact problem formulation. Since the eigenvalues of a graph depend continuously on the entries of the adjacency matrix, a slightly perturbed graph will have a similar spectrum. Thus, instead of determining whether two graphs are isomorphic, the assignment approach can potentially be generalized so that the best matching of two graphs is computed, i.e. a permutation that minimizes the Frobenius norm distance between them. We believe that the Frobenius norm will serve as a good cost function for the inexact isomorphism problem.
If the noise, however, is large, the spectrum of the graph might change in such a way that it becomes impossible to compare corresponding eigenvalues and eigenvectors.

Acknowledgements

We would like to thank the reviewers for their helpful comments and suggestions.

Footnotes

1. The symmetry of the adjacency matrix should not be confused with the aforementioned graph symmetries.
2. The eigenvectors are, without loss of generality, assumed to be normalized.
3. E.g. by ensuring that $$\mathbb{1}^{T} v_{A}^{(k)} > 0$$ and $$\mathbb{1}^{T} v_{B}^{(k)} > 0$$ for all eigenvectors.
4. In what follows, we will sometimes use the shorter cycle notation for permutations, i.e. a permutation is represented as a product of cycles, where cycles of length one are omitted. E.g. π = (1 2 3) means that 1 is assigned to 2, 2 to 3 and 3 to 1, while 4 remains unchanged.
5. This is due to the fact that $$\mathbb{1}$$ is always an eigenvector of regular graphs; all other eigenvectors must be perpendicular to it and are hence not friendly, cf. [7].
6. Note that due to the cyclic symmetry, we could assign vertex v1 to any other vertex of $$\mathcal{G}_{A}$$.
7. Assume that $$V_{A}^{(k)}$$ and $$V_{B}^{(k)}$$ are simple eigenvectors and contain the same entries; then comparing $$E_{A}^{(k)}$$ and $$E_{B}^{(k)}$$ leads to the same non-zero pattern as comparing the eigenvectors entry-wise.

References

Aflalo, Y., Bronstein, A. & Kimmel, R. (2015) On convex relaxation of graph isomorphism. Proc. Natl. Acad. Sci., 112, 2942–2947.
Arvind, V. & Torán, J. (2005) Isomorphism testing: perspective and open problems. Bull. Eur. Assoc. Theor. Comput. Sci., 86, 66–84.
Babai, L. (2015) Graph isomorphism in quasipolynomial time. CoRR, abs/1512.03547.
Babai, L., Grigoryev, D. Y. & Mount, D. M. (1982) Isomorphism of graphs with bounded eigenvalue multiplicity. Proceedings of the 14th Annual ACM Symposium on Theory of Computing. New York, NY, USA: ACM, pp. 310–324.
Burkard, R. E. & Çela, E. (1999) Linear assignment problems and extensions. Handbook of Combinatorial Optimization (D.-Z. Du & P. M. Pardalos eds). Dordrecht: Kluwer, pp. 75–149.
Chan, A. & Godsil, C. D. (1997) Symmetry and eigenvectors. Graph Symmetry: Algebraic Methods and Applications (G. Hahn & G. Sabidussi eds). Dordrecht: Kluwer, pp. 75–106.
Fiori, M. & Sapiro, G. (2015) On spectral properties for graph matching and graph isomorphism problems. Inf. Inference, 4, 63–76.
Godsil, C. D. (1998) Eigenpolytopes of distance regular graphs. Canad. J. Math., 50, 739–755.
Gower, J. C. & Dijksterhuis, G. B. (2004) Procrustes Problems. No. 30 in Oxford Statistical Science Series. Oxford: Oxford University Press.
Hallgren, S., Russell, A. & Ta-Shma, A. (2003) The hidden subgroup problem and quantum computation using group representations. SIAM J. Comput., 32, 916–934.
Hopcroft, J. E. & Tarjan, R. E. (1972) Isomorphism of planar graphs. Complexity of Computer Computations (R. E. Miller & J. W. Thatcher eds). Plenum Press, pp. 131–152.
Junttila, T. & Kaski, P. (2007) Engineering an efficient canonical labeling tool for large and sparse graphs. Proceedings of the Ninth Workshop on Algorithm Engineering and Experiments and the Fourth Workshop on Analytic Algorithms and Combinatorics (D. Applegate, G. S. Brodal, D. Panario & R. Sedgewick eds). Philadelphia, PA, USA: SIAM, pp. 135–149.
Köbler, J. (2006) On graph isomorphism for restricted graph classes. Logical Approaches to Computational Barriers (A. Beckmann, U. Berger, B. Löwe & J. Tucker eds). Lecture Notes in Computer Science. Berlin: Springer, pp. 241–256.
Kuhn, H. W. (1955) The Hungarian method for the assignment problem. Nav. Res. Logistics Q., 2, 83–97.
Leighton, F. T. & Miller, G. L. (1979) Certificates for graphs with distinct eigenvalues. Original manuscript.
Leskovec, J. & Krevl, A. (2014) SNAP Datasets: Stanford large network dataset collection. http://snap.stanford.edu/data.
López-Presa, J. L., Fernández Anta, A. & Núñez Chiroque, L. (2011a) Conauto-2.0: fast isomorphism testing and automorphism group computation. ArXiv e-prints.
López-Presa, J. L., Fernández Anta, A. & Núñez Chiroque, L. (2011b) Graph isomorphism algorithm conauto. https://sites.google.com/site/giconauto.
Lovász, L. (2007) Eigenvalues of graphs. Discussion paper. Budapest, Hungary: Eötvös Loránd University.
Luks, E. M. (1982) Isomorphism of graphs of bounded valence can be tested in polynomial time. J. Comput. Syst. Sci., 25, 42–65.
McAuley, J. & Leskovec, J. (2012) Learning to discover social circles in ego networks. Adv. Neural Inf. Process. Syst., 25, 539–547.
McKay, B. D. (1981) Practical graph isomorphism. Congressus Numerantium, 30, 45–87. Winnipeg, Canada: Utilitas Mathematica Pub.
McKay, B. D. & Piperno, A. (2014) Practical graph isomorphism, II. J. Symb. Comput., 60, 94–112.
Oren, I. & Band, R. (2012) Isospectral graphs with identical nodal counts. J. Phys. A Math. Theor., 45, 1–11.
Schönemann, P. (1968) On two-sided orthogonal Procrustes problems. Psychometrika, 33, 19–33.
Spielman, D. A. (1996) Faster isomorphism testing of strongly regular graphs. Proceedings of the 28th Annual ACM Symposium on Theory of Computing. New York, NY, USA: ACM, pp. 576–584.
Spielman, D. A. (2009) Spectral graph theory (lecture notes). http://www.cs.yale.edu/homes/spielman.
Valiayeu, V. V. (2011) Griso for regular graphs. http://sourceforge.net/projects/griso.

© The Author(s) 2018. Published by Oxford University Press on behalf of the Institute of Mathematics and its Applications. All rights reserved.

### Journal

Information and Inference: A Journal of the IMA, Oxford University Press

Published: Feb 9, 2018
