Large Cuts with Local Algorithms on Triangle-Free Graphs
the electronic journal of combinatorics 24(4) (2017), #P4.21

We study the problem of finding large cuts in $d$-regular triangle-free graphs. In prior work, Shearer (1992) gives a randomised algorithm that finds a cut of expected size $(1/2 + 0.177/\sqrt{d})m$, where $m$ is the number of edges. We give a simpler algorithm that does much better: it finds a cut of expected size $(1/2 + 0.28125/\sqrt{d})m$. As a corollary, this shows that in any $d$-regular triangle-free graph there exists a cut of at least this size. Our algorithm can be interpreted as a very efficient randomised distributed algorithm: each node needs to produce only one random bit, and the algorithm runs in one synchronous communication round. This work is also a case study of applying computational techniques in the design of distributed algorithms: our algorithm was designed by a computer program that searched for optimal algorithms for small values of $d$.


Introduction
We study the problem of finding large cuts in triangle-free graphs. In particular, we give a new lower bound on the size of cuts (or, equivalently, bipartite subgraphs) in $d$-regular triangle-free graphs: any such graph has a cut containing at least a fraction $1/2 + 0.28125/\sqrt{d}$ of the edges. This improves on the prior bound given by Shearer [19]. Our bound is constructive: we design an efficient distributed algorithm that finds a cut of this size in expectation.
The key novelty is that the distributed algorithm itself is constructed with computational techniques. In particular, this paper introduces a new approach for automating the design of distributed graph algorithms for optimisation problems. At the same time, these techniques allow us to automatically derive novel bounds for the existence of substructures in regular graphs.
In the case of the max-cut problem, the main idea is to show that for any fixed $d$ there exists a weighted graph $N_d$ with the following property: if the graph $N_d$ has a heavy cut, then there exists a distributed randomised algorithm that finds a large cut in any $d$-regular input graph, and vice versa. This way we can automate the design of distributed algorithms for the max-cut problem by solving a finite combinatorial optimisation problem with off-the-shelf tools. As we will see, once we have automatically discovered the structure of optimal algorithms for numerous small values of $d$, a pattern emerges that enables us to generalise the algorithm family to all values of $d$. Again, from the combinatorial perspective, this in turn gives a constructive lower bound for the maximum size of bipartite subgraphs for any $d$.

Random Cuts
Let $G = (V, E)$ be a simple undirected graph. A cut is a function $c \colon V \to \{a, b\}$ that labels the nodes with symbols $a$ and $b$. An edge $\{u, v\} \in E$ is a cut edge if $c(u) \ne c(v)$. We use the convention that the weight $w(c)$ of a cut $c$ is the fraction of edges that are cut edges; that is, the weight of the cut is normalised so that it is in the range $[0, 1]$. See Figure 1 for an illustration. While the problem of finding a maximum cut (or a good approximation of one) is NP-hard [4,6,8,16,21], there is a very simple randomised algorithm that finds a relatively large cut: for each node $v$, pick $c(v) \in \{a, b\}$ independently and uniformly at random. We say that $c$ is a uniform random cut. In a uniform random cut, each edge is a cut edge with probability $1/2$. It follows that the expected weight of a uniform random cut is also $1/2$.
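The averaging argument can also be verified exhaustively on a small example: averaging $w(c)$ over all $2^n$ labellings gives exactly $1/2$. A minimal sketch, using the 4-cycle as an arbitrary example graph:

```python
from fractions import Fraction
from itertools import product

def cut_weight(edges, c):
    # fraction of edges whose endpoints get different labels
    return Fraction(sum(1 for u, v in edges if c[u] != c[v]), len(edges))

# 4-cycle as an arbitrary example graph
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n = 4

# average w(c) over all 2^n labellings c : V -> {a, b}
total = sum(cut_weight(edges, dict(enumerate(bits)))
            for bits in product('ab', repeat=n))
average = total / 2 ** n
print(average)  # 1/2
```

Exact rational arithmetic (`fractions.Fraction`) avoids any floating-point doubt in such sanity checks.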

Regular Triangle-Free Graphs
In general graphs, we cannot expect to find cuts that are much better than uniform random cuts. For example, in a complete graph on $n$ nodes, the weight of any cut is at most $1/2 + O(1/n)$. However, there is a family of graphs that makes for a much more interesting case from the perspective of the max-cut problem: regular triangle-free graphs. Erdős [2] raised the problem of estimating the minimum possible size of a maximum cut in a high-girth graph, and especially the case of triangle-free graphs attracted much interest from the research community [1,17,19].
Accordingly, from now on, we assume that $G$ is a $d$-regular graph for some constant $d \geq 2$, and that there are no triangles (cycles of length three) in $G$. While focusing on regular triangle-free graphs may seem overly restrictive, our algorithm can be applied in a much more general setting; we will briefly discuss extensions in Section 3.

Shearer's Algorithm
In triangle-free graphs, it is easy to find cuts that are (in expectation) larger than uniform random cuts. Nevertheless, a uniform random cut is a good starting point.
Shearer's [19] algorithm proceeds as follows. Pick three uniform random cuts $c_1$, $c_2$, and $c_3$. For each node $v$, let
$$\ell(v) = \bigl|\bigl\{\{v, u\} \in E : c_1(v) = c_1(u)\bigr\}\bigr|$$
be the number of like-minded neighbours in $c_1$. Then the output of a node $v$ is
$$c(v) = \begin{cases} c_1(v) & \text{if } \ell(v) < d/2, \\ c_1(v) & \text{if } \ell(v) = d/2 \text{ and } c_3(v) = a, \\ c_2(v) & \text{otherwise.} \end{cases} \qquad (1)$$
Put otherwise, a node follows $c_1$ if it seems that there are many cut edges w.r.t. $c_1$ in its immediate neighbourhood, and it falls back to another cut $c_2$ otherwise. The value $c_3(v)$ is just used as a random tie-breaker.
Shearer [19] shows that the expected weight of cut (1) is at least
$$\frac{1}{2} + \frac{0.177}{\sqrt{d}} \qquad (2)$$
in $d$-regular triangle-free graphs.
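To make the rule concrete, here is a small executable sketch of one round of this scheme. The exact tie-breaking convention is our reconstruction of (1), and the 4-cycle is just an arbitrary 2-regular triangle-free test graph; averaging over all $2^{12}$ choices of the three random cuts gives the exact expected cut weight.

```python
from fractions import Fraction
from itertools import product

def shearer_round(adj, c1, c2, c3):
    # One round of Shearer's rule (as reconstructed above): follow c1 if
    # fewer than d/2 neighbours agree with us in c1, fall back to c2 if
    # more than d/2 agree, and let c3 break the tie at exactly d/2.
    out = {}
    for v, nbrs in adj.items():
        like = sum(1 for u in nbrs if c1[u] == c1[v])
        if like < len(nbrs) / 2:
            out[v] = c1[v]
        elif like > len(nbrs) / 2:
            out[v] = c2[v]
        else:
            out[v] = c1[v] if c3[v] == 'a' else c2[v]
    return out

# exact expected cut weight on the 4-cycle (2-regular, triangle-free)
adj = {0: (1, 3), 1: (0, 2), 2: (1, 3), 3: (0, 2)}
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
total = Fraction(0)
for bits in product('ab', repeat=12):
    c1 = dict(enumerate(bits[0:4]))
    c2 = dict(enumerate(bits[4:8]))
    c3 = dict(enumerate(bits[8:12]))
    out = shearer_round(adj, c1, c2, c3)
    total += Fraction(sum(1 for u, v in edges if out[u] != out[v]), len(edges))
expected = total / 2 ** 12
print(expected)  # 5/8
```

On this particular graph the exact expected weight works out to $5/8$, strictly above the $1/2$ achieved by a uniform random cut.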

Our Algorithm
Shearer's algorithm can be characterised as follows: take a uniform random cut $c_1$ and then improve it with the help of the randomised rule described in (1). In this work, we show that we can do better with the help of a simple deterministic rule.
In our algorithm we pick one uniform random cut $c_1$. Again, each node $v$ counts the number of like-minded neighbours, $\ell(v) = \bigl|\bigl\{\{v, u\} \in E : c_1(v) = c_1(u)\bigr\}\bigr|$. We define the threshold
$$\tau = \left\lceil \frac{d + \sqrt{d}}{2} \right\rceil. \qquad (4)$$
Now the output of a node $v$ is simply
$$c'(v) = \begin{cases} -c_1(v) & \text{if } \ell(v) \geq \tau, \\ c_1(v) & \text{otherwise.} \end{cases} \qquad (5)$$
Here $-c_1(v)$ is the complement of $c_1(v)$, that is, $-a = b$ and $-b = a$. In the algorithm each node simply changes its mind if it seems that there are too many like-minded neighbours.
It is not obvious that such a rule makes sense, or that this particular choice of $\tau$ is good. Nevertheless, we show in this work that the expected weight of cut (5) is at least
$$\frac{1}{2} + \frac{0.28125}{\sqrt{d}}, \qquad (6)$$
which is larger than Shearer's bound (2), especially in low-degree graphs, and much closer to the upper bound (3). As a corollary, any $d$-regular triangle-free graph admits a cut of at least this size. Our algorithm can be implemented very efficiently in a distributed setting, i.e., in the LOCAL model [20]: each node only needs to produce one random bit, and the algorithm only requires one communication round. In Shearer's algorithm each node has to produce up to three random bits.
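The one-round threshold rule is equally easy to make concrete. The sketch below assumes the threshold has the closed form $\tau = \lceil (d + \sqrt{d})/2 \rceil$ (an assumption on our part, consistent with $\tau_3 = 3$ mentioned in Remark 1) and computes the exact expected cut weight on the 4-cycle, again an arbitrary 2-regular triangle-free example.

```python
from fractions import Fraction
from itertools import product
from math import ceil, sqrt

def threshold_round(adj, c, tau):
    # Each node flips its label iff it has at least tau like-minded neighbours.
    flip = {'a': 'b', 'b': 'a'}
    out = {}
    for v, nbrs in adj.items():
        like = sum(1 for u in nbrs if c[u] == c[v])
        out[v] = flip[c[v]] if like >= tau else c[v]
    return out

# exact expected cut weight on the 4-cycle: d = 2
adj = {0: (1, 3), 1: (0, 2), 2: (1, 3), 3: (0, 2)}
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
d = 2
tau = ceil((d + sqrt(d)) / 2)  # assumed threshold formula, gives tau = 2 here
total = Fraction(0)
for bits in product('ab', repeat=4):
    out = threshold_round(adj, dict(enumerate(bits)), tau)
    total += Fraction(sum(1 for u, v in edges if out[u] != out[v]), len(edges))
expected = total / 2 ** 4
print(expected)  # 3/4
```

Note that only one random bit per node is consumed, and the exact expected weight $3/4$ on this example already exceeds the $5/8$ that a hand computation gives for the three-bit rule (1) on the same graph.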
In Section 2, we outline the computer-aided procedure that we used to design the algorithm, and then present an analysis of its performance. In Section 3 we discuss how to apply the algorithm in a more general setting beyond regular triangle-free graphs.

Algorithm Design and Analysis
We begin this section with an informal overview of so-called neighbourhood graphs. The formal definitions that we use in this work are given after that.

Neighbourhood Graphs in Prior Work
In the context of distributed systems, the radius-$t$ neighbourhood $N(t, v)$ of a node $v$ refers to all information that node $v$ may gather in $t$ communication rounds. Depending on the model of computation that we use, this may include all nodes that are within distance $t$ from $v$, the edges incident to these nodes, their local inputs, and the random bits that these nodes have generated. The idea is that whatever decision node $v$ takes, it can only depend on its radius-$t$ neighbourhood: any distributed algorithm $A$ that runs in $t$ communication rounds can be interpreted as a mapping from local neighbourhoods to local outputs. A neighbourhood graph $N_t$ is a graph representation of all possible radius-$t$ neighbourhoods that a distributed algorithm may encounter. Each node $N \in V(N_t)$ of the neighbourhood graph corresponds to a possible local neighbourhood: there is at least one communication network in which some node has a local neighbourhood isomorphic to $N$. We have an edge $\{N_1, N_2\} \in E(N_t)$ in the neighbourhood graph if there is some communication network in which nodes with local neighbourhoods $N_1$ and $N_2$ are adjacent; see Figure 2 for an example.
Neighbourhood graphs are a convenient concept in the study of graph colouring algorithms, both from the perspective of traditional algorithm design [3,7,10,11,15] and from the perspective of computational algorithm design [18]. The key observation is that the following two statements are equivalent:
• $A$ is a distributed algorithm that finds a proper $k$-colouring in $t$ rounds.
• $A$ is a proper $k$-colouring of the neighbourhood graph $N_t$.
To see this, consider any graph $G$. If nodes $u$ and $v$ are adjacent in $G$, then their local views $N(t, u)$ and $N(t, v)$ are adjacent in $N_t$, and by assumption $A$ assigns a different colour to $N(t, u)$ and $N(t, v)$. Hence distributed algorithm $A$ finds a proper $k$-colouring of $G$. Conversely, if algorithm $A$ finds a proper colouring in any communication network, it defines a proper $k$-colouring of $N_t$.
In summary, colourings of the neighbourhood graph correspond to distributed algorithms for graph colouring, and vice versa. In general, a similar property does not hold for arbitrary graph problems. For example, there is no one-to-one correspondence between maximal independent sets of $N_t$ and distributed algorithms that find maximal independent sets [18, Section 8.5].
However, as we will see in this work, we can use neighbourhood graphs also in the context of the maximum cut problem. It turns out that we can define a weighted version of neighbourhood graphs so that there is a one-to-one correspondence between heavy cuts in the weighted neighbourhood graph and randomised distributed algorithms that find large cuts in expectation.

Model of Distributed Computing
Next, we formalise the model of distributed computing that is sufficient for the purposes of our algorithm. Fix the parameter $d$; recall that we are interested in $d$-regular triangle-free graphs. Let $G = (V, E)$ be such a graph, and let $c$ be a uniform random cut in $G$. The local neighbourhood of a node $v$ is the pair $N_c(v) = (c(v), \ell(v))$, where $\ell(v)$ is the number of neighbours with the same random bit. Note that there are only $2d + 2$ possible local neighbourhoods.
A distributed algorithm is a function $A$ that associates an output $A(N) \in \{a, b\}$ with each local neighbourhood $N$. For any $d$-regular triangle-free graph $G = (V, E)$, function $A$ defines a randomised process that produces a random cut $c'$ as follows:
1. Pick a uniform random cut $c$.
2. For each node $v$, set $c'(v) = A(N_c(v))$.

We use the notation $A(G)$ for the random cut $c'$ produced by algorithm $A$ in graph $G$. In particular, we are interested in the quantity $E[w(A(G))]$, the expected weight of cut $c'$.
A priori, we might expect that $E[w(A(G))]$ would depend on $G$. However, as we will soon see, this is not the case: it only depends on the parameter $d$ and the algorithm $A$.

Weighted Neighbourhood Graph
A weighted digraph is a pair $D = (V, w)$ with $w \colon V \times V \to [0, \infty)$. Here $V$ is the set of nodes, and $w$ associates a non-negative weight $w(x, y) \geq 0$ with each directed edge $(x, y) \in V \times V$. Let $c \colon V \to \{a, b\}$ be a cut in weighted digraph $D$. The weight of cut $c$ is
$$w(c) = \sum_{(x, y)\,:\, c(x) \ne c(y)} w(x, y),$$
the total weight of all cut edges.

Figure 3: Weighted neighbourhood graph N for d = 3. Edge weights are denoted by line widths; missing edges have weight 0. Note that the digraph is symmetric; however, we prefer the directed representation so that we do not need special treatment for self-loops.

The weighted neighbourhood graph $N = (V_N, w_N)$ is a weighted digraph defined as follows (see Figure 3 for an illustration). The set of nodes $V_N$ consists of all possible local neighbourhoods $(x, i)$, with $x \in \{a, b\}$ and $i \in \{0, 1, \ldots, d\}$, that we may encounter in $d$-regular triangle-free graphs. We define the edge weights as follows:
$$w_N\bigl((x, i), (y, j)\bigr) = \frac{1}{4 \cdot 4^{d-1}} \binom{d-1}{i - s} \binom{d-1}{j - s}, \quad \text{where } s = \begin{cases} 1 & \text{if } x = y, \\ 0 & \text{if } x \ne y. \end{cases}$$
We follow the convention that $\binom{n}{k} = 0$ for $k < 0$ or $k > n$. Note that the weights are symmetric, and the total weight of all edges is 1. The following lemma shows that the weight of the edge $(N_1, N_2)$ in the neighbourhood graph equals the probability of "observing" adjacent neighbourhoods of types $N_1$ and $N_2$; see Figure 4. Note that the probability does not depend on the choice of graph $G$ or edge $\{u, v\}$.
Lemma 1. Let $G$ be a $d$-regular triangle-free graph, and let $\{u, v\}$ be an edge of $G$. Consider a uniform random cut $c$ of $G$. Then for any given neighbourhoods $N_1$ and $N_2$ we have
$$\Pr\bigl[N_c(u) = N_1 \text{ and } N_c(v) = N_2\bigr] = w_N(N_1, N_2).$$

Proof. In what follows, we will denote the neighbours of $u$ by $u_1, u_2, \ldots, u_d$, where $u_d = v$.
Similarly, denote the neighbours of $v$ by $v_1, v_2, \ldots, v_d$, where $v_d = u$, and let $S_u = \{u_1, \ldots, u_{d-1}\}$ and $S_v = \{v_1, \ldots, v_{d-1}\}$. Since $G$ is triangle-free, the sets $S_u$ and $S_v$ are disjoint. In particular, the random variables $c(x)$ for $x \in S_u \cup S_v$ are independent.
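The product structure behind Lemma 1 can also be checked exhaustively for a small case. The sketch below uses our reconstruction of the edge-weight formula (the binomial expression with the $1/(4 \cdot 4^{d-1})$ normalisation) and compares it against the exact joint distribution of the two endpoint neighbourhoods on the 4-cycle, i.e., for $d = 2$:

```python
from fractions import Fraction
from itertools import product
from math import comb

d = 2
adj = {0: (1, 3), 1: (0, 2), 2: (1, 3), 3: (0, 2)}  # 4-cycle: 2-regular, triangle-free
nodes = [(x, i) for x in 'ab' for i in range(d + 1)]

def w(n1, n2):
    # reconstructed edge weight: Pr[adjacent endpoints see neighbourhoods n1, n2]
    (x, i), (y, j) = n1, n2
    s = 1 if x == y else 0  # the shared edge counts as like-minded iff x == y
    if 0 <= i - s <= d - 1 and 0 <= j - s <= d - 1:
        return Fraction(comb(d - 1, i - s) * comb(d - 1, j - s), 4 * 4 ** (d - 1))
    return Fraction(0)

# exact joint distribution of (N_c(0), N_c(1)) over all 2^4 uniform random cuts
counts = {}
for bits in product('ab', repeat=4):
    c = dict(enumerate(bits))
    pair = tuple((c[v], sum(1 for u in adj[v] if c[u] == c[v])) for v in (0, 1))
    counts[pair] = counts.get(pair, 0) + 1

ok = all(Fraction(counts.get((n1, n2), 0), 16) == w(n1, n2)
         for n1 in nodes for n2 in nodes)
print(ok)  # True
```

The exhaustive count agrees with the formula entry by entry, and both distributions sum to 1.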

Cuts in Neighbourhood Graphs
Any function $A \colon V_N \to \{a, b\}$ can be interpreted in two ways:
1. A cut of weight $w_N(A)$ in the weighted neighbourhood graph $N$.
2. A distributed algorithm that finds a cut in any $d$-regular triangle-free graph: the algorithm picks a uniform random cut $c$, and then node $v$ outputs $A(N_c(v))$.
The following lemma shows that the two interpretations are closely related: if $A$ is a cut of weight $w$ in neighbourhood graph $N$, then it immediately gives us a distributed algorithm that finds a cut of expected weight $w$ in any $d$-regular triangle-free graph.
Lemma 2. Let $A$ be a cut of weight $w_N(A)$ in the weighted neighbourhood graph $N$. Then for any $d$-regular triangle-free graph $G$, the expected weight of the cut $A(G)$ is $E[w(A(G))] = w_N(A)$.

Proof. Fix a graph $G$ and an edge $\{u, v\}$ of $G$. By Lemma 1 we have
$$\Pr\bigl[A(N_c(u)) \ne A(N_c(v))\bigr] = \sum_{(N_1, N_2)\,:\, A(N_1) \ne A(N_2)} w_N(N_1, N_2) = w_N(A).$$
The claim follows by summing over all edges $\{u, v\}$ of $G$.

Computational Algorithm Design
Now we have all the tools that we need. Lemma 2 gives a one-to-one correspondence between large cuts of the neighbourhood graph and distributed algorithms that find large cuts. For any fixed value of $d$, the task of designing a distributed algorithm is now straightforward:
1. Construct the weighted neighbourhood graph $N$.
2. Find a heavy cut in $N$.
See Figure 5 for an example. For $d = 3$, the heaviest cut $A_{\mathrm{opt}}$ of $N$ is
$$A_{\mathrm{opt}}(x, i) = \begin{cases} -x & \text{if } i = 3, \\ x & \text{otherwise.} \end{cases} \qquad (7)$$
This is also the best possible algorithm for this value of $d$, for the model of computing that we defined in Section 2.2.
Remark 1. The reader may want to compare (7) with Section 1.5. For $d = 3$, the algorithms are identical, albeit with a slightly different notation. Note that $\tau_3 = 3$.
Of course, finding a maximum-weight cut is hard in the general case. However, in this particular case the neighbourhood graphs are relatively small (only $2d + 2$ nodes).
While the smallest cases could easily be solved with brute force, slightly more refined approaches are helpful for moderate values of $d$. We took the following approach. First, we reduced the max-weight-cut instance $N$ to a max-weight-SAT instance $\varphi$ in a straightforward manner: • For each node $u \in V_N$ we have a Boolean variable $x_u$ in formula $\varphi$.
• For each edge $(u, v)$ of weight $w_N(u, v)$ we have two clauses in formula $\varphi$, both of weight $w_N(u, v)$: $x_u \lor x_v$ and $\lnot x_u \lor \lnot x_v$. Note that at least one of these clauses is always satisfied, while both of them are satisfied if and only if $x_u$ and $x_v$ have different values.
Now it is easy to see that a variable assignment $x$ of $\varphi$ that maximises the total weight of satisfied clauses also gives a maximum-weight cut $A$ in $N$: let $A(u) = a$ iff $x_u$ is true. More precisely, the total weight of the clauses satisfied by $x$ is $W + w_N(A)$, where $W$ is the total weight of all edges.
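The identity between satisfied clause weight and cut weight is easy to verify mechanically. The sketch below checks it on a small arbitrary symmetric weighted digraph; the three nodes and their weights are made up purely for illustration.

```python
from fractions import Fraction
from itertools import product

# an arbitrary symmetric weighted digraph on three nodes (hypothetical weights)
w = {('p', 'q'): Fraction(1, 4), ('q', 'p'): Fraction(1, 4),
     ('q', 'r'): Fraction(1, 8), ('r', 'q'): Fraction(1, 8),
     ('p', 'r'): Fraction(1, 8), ('r', 'p'): Fraction(1, 8)}
W = sum(w.values())  # total weight of all edges

def cut_weight(assign):
    # total weight of edges whose endpoints get different truth values
    return sum(wt for (u, v), wt in w.items() if assign[u] != assign[v])

def satisfied_weight(assign):
    # clauses (x_u OR x_v) and (NOT x_u OR NOT x_v), each of weight w(u, v)
    total = Fraction(0)
    for (u, v), wt in w.items():
        if assign[u] or assign[v]:
            total += wt
        if not assign[u] or not assign[v]:
            total += wt
    return total

# the identity "satisfied weight = W + cut weight" holds for every assignment
ok = all(satisfied_weight(dict(zip('pqr', bits)))
         == W + cut_weight(dict(zip('pqr', bits)))
         for bits in product((False, True), repeat=3))
print(ok)  # True
```

A cut edge satisfies both of its clauses (contributing $2 w_N(u, v)$), while a non-cut edge satisfies exactly one, which is where the constant offset $W$ comes from.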
With this reduction, we can then resort to off-the-shelf max-weight-SAT solvers. In our experiments we used the akmaxsat solver [9]; with it we can solve the cases $d = 2, 3, \ldots, 32$ very quickly (e.g., the case $d = 32$ on a low-end laptop in less than 5 seconds).
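For small $d$, the whole pipeline also fits in a few lines of Python: build the weighted neighbourhood graph (using our reconstruction of the edge-weight formula) and find the heaviest cut by brute force instead of a MaxSAT solver. For $d = 3$ this yields a heaviest cut of weight $11/16$, which is the weight of the threshold cut with $\tau_3 = 3$:

```python
from fractions import Fraction
from itertools import product
from math import comb

def neighbourhood_graph(d):
    # nodes (x, i): side x in {'a', 'b'}, i like-minded neighbours, 0 <= i <= d
    nodes = [(x, i) for x in 'ab' for i in range(d + 1)]
    def weight(n1, n2):
        (x, i), (y, j) = n1, n2
        s = 1 if x == y else 0  # the shared edge counts as like-minded iff x == y
        if 0 <= i - s <= d - 1 and 0 <= j - s <= d - 1:
            return Fraction(comb(d - 1, i - s) * comb(d - 1, j - s),
                            4 * 4 ** (d - 1))
        return Fraction(0)
    w = {(n1, n2): weight(n1, n2) for n1 in nodes for n2 in nodes}
    return nodes, w

def heaviest_cut(d):
    # brute force over all 2^(2d+2) cuts -- fine for small d
    nodes, w = neighbourhood_graph(d)
    best_w, best_A = Fraction(0), None
    for bits in product('ab', repeat=len(nodes)):
        A = dict(zip(nodes, bits))
        cw = sum(wt for (n1, n2), wt in w.items() if A[n1] != A[n2])
        if cw > best_w:
            best_w, best_A = cw, A
    return best_w, best_A

best_w, best_A = heaviest_cut(3)
print(best_w)  # 11/16
```

For moderate $d$ the $2^{2d+2}$ loop becomes infeasible, which is exactly where the MaxSAT reduction above takes over.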
Surprisingly, in all cases the max-weight cut has the following simple threshold structure:
$$A_\tau(x, i) = \begin{cases} -x & \text{if } i \geq \tau, \\ x & \text{otherwise.} \end{cases} \qquad (8)$$
The exact values of $\tau$ for the heaviest cuts are given in Table 1.

Generalisation
Now it is easy to generalise the findings: we can make the educated guess that algorithms of the form (8) are good also for general $d$. All we need to do is to find a general expression for the threshold $\tau$, and prove that algorithm $A_\tau$ indeed works well in the general case.
To facilitate algorithm analysis, let us define the shorthand notation $\alpha(\tau, d)$ for the performance of algorithm $A_\tau$, that is, the probability that a fixed edge is a cut edge in the output. It is easy to see that $\alpha(0, d) = \alpha(d + 1, d) = 1/2$, as the threshold value $\tau = d + 1$ simply means that algorithm $A_\tau$ outputs a uniform random cut, while $\tau = 0$ means that $A_\tau$ outputs the complement of the uniform random cut. The general shape of $\alpha(\tau, d)$ is illustrated in Figure 6. We are interested in the region $\tau > d/2$, where $\alpha(\tau, d) \geq 1/2$. In the following, we derive a relatively simple expression for $\alpha(\tau, d)$ in this region; the proof strategy is inspired by Shearer [19].

Lemma 3. For all $d$ and $\tau > d/2$ we have
$$\alpha(\tau, d) = \frac{1}{2} + 2^{-2(d-1)} \binom{d-1}{\tau-1} \sum_{k=d+1-\tau}^{\tau-1} \binom{d-1}{k}.$$

The first part is easily solved with a simple Python script or with a short calculation in Mathematica (see Figure 8 for examples of the results for $d = 2, 3, \ldots, 50$). We will now focus on the second part; for that we will need various estimates of binomial coefficients. The proof given here is certainly not the most elegant way to derive the bound, but it is self-contained and gets the job done. Proving the claim for a "sufficiently large" $d$ would be straightforward. However, we need to show that already a concrete, relatively small $d$ such as $d > 3000$ is enough.
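The "simple Python script" route can be sketched as follows: $\alpha(\tau, d)$ is evaluated exactly with rational arithmetic by conditioning on whether the two endpoints of an edge agree in the initial cut (the flip probabilities are binomial tail probabilities over the $d - 1$ remaining neighbours). The threshold expression $\tau = \lceil (d + \sqrt{d})/2 \rceil$ is our assumption; for example, the script reproduces $\alpha(3, 3) = 11/16$.

```python
from fractions import Fraction
from math import ceil, comb, sqrt

def binom_tail(n, k):
    # Pr[Bin(n, 1/2) >= k], computed exactly
    return Fraction(sum(comb(n, t) for t in range(max(k, 0), n + 1)), 2 ** n)

def alpha(tau, d):
    # Pr[a fixed edge is a cut edge] for the rule "flip iff >= tau like-minded
    # neighbours"; uses the independence of the endpoints' other neighbours
    # in a d-regular triangle-free graph.
    p = binom_tail(d - 1, tau - 1)    # flip probability when the endpoints agree
    q = binom_tail(d - 1, tau)        # flip probability when they disagree
    return (2 * p * (1 - p)                    # agree: cut iff exactly one flips
            + q * q + (1 - q) * (1 - q)) / 2   # disagree: cut iff both or neither flips

for d in range(2, 21):
    tau = ceil((d + sqrt(d)) / 2)  # assumed threshold; matches tau_3 = 3
    print(d, tau, alpha(tau, d), float(alpha(tau, d)))
```

Comparing $(\alpha(\tau, d) - 1/2)^2 \, d$ against $(9/32)^2$ in exact arithmetic checks the bound $1/2 + 0.28125/\sqrt{d}$ for small $d$ without any floating-point rounding; under the assumed threshold it holds with equality at $d = 4$.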
We will first approximate binomial coefficients with the normal distribution. Let $J = \{1, 2, 3, 4\}$, and define $\delta_j(n) = j\sqrt{n/32}$ and $g_j = e^{-j^2/32}$ for each $j \in \{0\} \cup J$.

Lemma 6. For any $j \in J$, $\delta = \delta_j(n)$, and $n \geq 1500$ we have

Proof. We can estimate

where

Now $h_j(\delta) \to g_j$ as $\delta \to \infty$. For each $j \in J$ we can verify that $h_j(\delta) > 0.995 \cdot g_j$ when $\delta \geq \delta_j(1500)$.

have the same guarantee that each original edge is a cut edge with probability $\alpha(\tau, d)$.
The running time of the algorithm is still one communication round; however, some nodes need to produce more random bits.
2. Our algorithm can also be applied in any graph, even in those that contain triangles. Now our analysis shows that each edge that is not part of a triangle will be a cut edge with probability $\alpha(\tau, d)$. This observation already gives a simple bound: if at most a fraction $\varepsilon$ of all edges are part of a triangle, we will find a cut of expected size at least $(1 - \varepsilon) \cdot \alpha(\tau, d)$.
In this work we have studied one-round algorithms, and the obvious question for future work is the analysis of algorithms with multiple communication rounds. While the gap between the bounds (3) and (6) does not leave that much room for improvement, we conjecture that it could be further narrowed down by iterated applications of threshold algorithms.

Figure 2 :
Figure 2: In this example, we study the family F of 3-regular triangle-free graphs that are labelled with two colours, black and white. (a) A small part of the neighbourhood graph $N_t$ for $t = 1$. (b) There exists a graph $G \in F$ in which local neighbourhoods $N_1$ and $N_2$ are adjacent; hence nodes $N_1$ and $N_2$ are adjacent in the neighbourhood graph.

Figure 5 :
Figure 5: Maximum-weight cut in the weighted neighbourhood graph for d = 3.

Fact 5.
For any $n \geq 1500$ we have

Table 1 :
Optimal threshold τ opt for small values of d.