
The ordering thus defined is reflexive, since every vertex is reachable from itself. It is antisymmetric, because if <math>u \leq v</math>, <math>v \leq u</math>, and <math>u \neq v</math>, then we have a cycle that contains <math>u</math> and <math>v</math>, a contradiction. And it is transitive because reachability is transitive: if there are paths from <math>u</math> to <math>v</math> and from <math>v</math> to <math>w</math>, then we can concatenate them to obtain a path from <math>u</math> to <math>w</math>. So the ordering we have defined is a partial ordering.
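
To make the three properties concrete, reachability on a small DAG can be checked directly. This is an illustrative sketch only; the example graph, its adjacency-list format, and the helper name <code>leq</code> are not from the article.

```python
from itertools import product

# Hypothetical example DAG as an adjacency list (vertices 1..4).
dag = {1: [2, 3], 2: [4], 3: [4], 4: []}

def leq(g, u, v):
    """u <= v  iff  v is reachable from u (every vertex reaches itself)."""
    seen, stack = set(), [u]
    while stack:
        x = stack.pop()
        if x == v:
            return True
        if x not in seen:
            seen.add(x)
            stack.extend(g[x])
    return False

V = list(dag)
# Reflexive: every vertex is reachable from itself.
assert all(leq(dag, u, u) for u in V)
# Antisymmetric: u <= v and v <= u force u == v (otherwise a cycle).
assert all(not (leq(dag, u, v) and leq(dag, v, u)) or u == v
           for u, v in product(V, V))
# Transitive: paths from u to v and from v to w concatenate.
assert all(not (leq(dag, u, v) and leq(dag, v, w)) or leq(dag, u, w)
           for u, v, w in product(V, V, V))
```
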


Note that whereas a given DAG corresponds to exactly one partial ordering (up to isomorphism) using this convention, there may be more than one DAG that corresponds to a given partial ordering. The DAG with the greatest number of edges that corresponds to a given partial ordering is known as the [[transitive closure]], whereas the one with the fewest edges is known as the [[transitive reduction]].
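
As a rough sketch of the two extremes, the closure can be computed Warshall-style and the reduction derived from it by discarding edges implied by a two-step path. The vertex numbering and edge sets here are hypothetical, not from the article.

```python
def closure(n, edges):
    """All pairs (u, v) with a nonempty path u -> v (Warshall-style)."""
    reach = [[False] * n for _ in range(n)]
    for u, v in edges:
        reach[u][v] = True
    for k in range(n):
        for i in range(n):
            if reach[i][k]:
                for j in range(n):
                    if reach[k][j]:
                        reach[i][j] = True
    return {(i, j) for i in range(n) for j in range(n) if reach[i][j]}

def reduction(n, edges):
    """Keep only closure edges not implied by an intermediate vertex."""
    c = closure(n, edges)
    return {(u, v) for u, v in c
            if not any((u, w) in c and (w, v) in c for w in range(n))}

# Chain 0 -> 1 -> 2 plus the redundant shortcut 0 -> 2.
e = {(0, 1), (1, 2), (0, 2)}
print(sorted(closure(3, e)))    # [(0, 1), (0, 2), (1, 2)]
print(sorted(reduction(3, e)))  # [(0, 1), (1, 2)]
```

Discarding implied edges is valid here because the input is acyclic; on a graph with cycles the reduction is not a subset of the original edges in general.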

==Properties==

A '''source''' is a node with zero in-degree; all of its edges point outward. Likewise, a '''sink''' is a node with zero out-degree. Every finite DAG has at least one source and one sink. To see this, choose any node. If it is a sink, we are done; otherwise, select any outgoing edge and follow it to another node. Repeating this process, we will never revisit a node, because the graph has no cycles, so after at most <math>V-1</math> iterations (where <math>V</math> is the number of nodes in the graph) the process must terminate at a sink. The proof that there must be at least one source is analogous.
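
The definitions translate directly into a degree count. A minimal sketch, assuming a DAG given as an adjacency list (the example graph and function name are invented for illustration):

```python
def sources_and_sinks(g):
    """Return (sources, sinks): vertices of in-degree 0 and out-degree 0."""
    indeg = {u: 0 for u in g}
    for u in g:
        for v in g[u]:
            indeg[v] += 1
    sources = [u for u in g if indeg[u] == 0]
    sinks = [u for u in g if not g[u]]
    return sources, sinks

dag = {'a': ['b', 'c'], 'b': ['d'], 'c': ['d'], 'd': []}
print(sources_and_sinks(dag))  # (['a'], ['d'])
```
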

In fact, a stronger statement holds: a directed graph may be ''[[Topological sort|topologically ordered]]'' if and only if it is acyclic (a DAG).
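
One way to see the equivalence operationally is Kahn's algorithm, which outputs a topological order exactly when the graph is acyclic. This is a sketch, not the article's own code; the graph format is an assumption.

```python
from collections import deque

def topological_order(g):
    """Kahn's algorithm: repeatedly remove a vertex of in-degree zero.
    Returns an order covering all vertices, or None if a cycle remains."""
    indeg = {u: 0 for u in g}
    for u in g:
        for v in g[u]:
            indeg[v] += 1
    q = deque(u for u in g if indeg[u] == 0)
    order = []
    while q:
        u = q.popleft()
        order.append(u)
        for v in g[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                q.append(v)
    return order if len(order) == len(g) else None

print(topological_order({1: [2], 2: [3], 3: []}))  # [1, 2, 3]
print(topological_order({1: [2], 2: [1]}))         # None (cyclic)
```
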

Because a DAG has no cycles, each vertex of a DAG forms its own strongly connected component.

A DAG with <math>V</math> vertices cannot have more than <math>V(V-1)/2</math> edges, and this bound is attained by taking a complete undirected graph on <math>V</math> vertices, assigning a different number to each vertex, and orienting every edge so that it points from the lower-numbered vertex to the higher-numbered vertex. Such a DAG represents a total order. The proof is not hard and is left as an exercise for the reader.
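
The construction in the proof can be written out directly; the function name here is invented for this sketch.

```python
from itertools import combinations

def complete_dag(v):
    """Orient every edge of the complete graph on vertices 0..v-1
    from the lower-numbered endpoint to the higher-numbered one."""
    return [(i, j) for i, j in combinations(range(v), 2)]  # always i < j

# The edge count matches the V(V-1)/2 bound for small cases.
for v in range(1, 7):
    assert len(complete_dag(v)) == v * (v - 1) // 2
print(complete_dag(4))  # [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
```
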

==Problems==

The following problems either involve DAGs specifically or have simpler or faster solutions for DAGs than for other graphs.

* The '''[[topological sort]]ing''' problem mentioned in the preceding section.
* '''[[Shortest paths]]''' and '''transitive closure'''
:* '''Single-source shortest paths''' can be computed in <math>O(E+V)</math> time by topologically sorting first and then using [[dynamic programming]].
:* '''[[All-pairs shortest paths]]''' can be computed in <math>O(V(E+V))</math> time by <math>V</math> invocations of the single-source algorithm. This is, however, no faster than [[breadth-first search]] when the DAG is unweighted, and no faster than the [[Floyd–Warshall algorithm]] when the graph is dense (so it is an improvement only for sparse weighted DAGs).
::* The same applies to the '''transitive closure problem''' – it is not easy to improve on the <math>O(V^3)</math> bound given by Warshall's algorithm, but see [[Directed acyclic graph/Transitive closure|this]] for a method that may perform better under certain circumstances. The bounds on the '''transitive reduction problem''' match those on transitive closure.
* '''Longest path problem''': This can also be solved efficiently using dynamic programming. Note that we only need to consider paths that start at a source.
* Identifying all strongly connected components in a digraph and contracting each component to a single vertex produces a DAG, known as the '''kernel DAG'''. Adapting the special transitive closure algorithm on DAGs to the kernel DAG gives an algorithm which may be fast for some classes of digraphs. The kernel DAG is also useful as an intermediate structure in the solutions of various other problems.
* The '''[[minimum spanning arborescence]] problem''' does not have much to do with DAGs in general, but an arborescence is a specific kind of DAG that looks like a tree.
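
The single-source shortest-path item above can be sketched as follows: topologically sort, then relax edges in that order, which is the dynamic-programming step. The graph format and weights are illustrative assumptions; replacing <code>min</code> with <code>max</code> (and <code>-inf</code> as the initial value) gives the longest-path variant on the same schedule.

```python
from collections import deque
import math

def dag_shortest_paths(g, source):
    """Shortest paths from source in a weighted DAG, O(E + V).
    g maps each vertex to a list of (neighbor, weight) pairs."""
    # Topological order via Kahn's algorithm.
    indeg = {u: 0 for u in g}
    for u in g:
        for v, _ in g[u]:
            indeg[v] += 1
    q = deque(u for u in g if indeg[u] == 0)
    order = []
    while q:
        u = q.popleft()
        order.append(u)
        for v, _ in g[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                q.append(v)
    # Dynamic programming: relax each edge once, in topological order.
    dist = {u: math.inf for u in g}
    dist[source] = 0
    for u in order:
        if dist[u] < math.inf:
            for v, w in g[u]:
                dist[v] = min(dist[v], dist[u] + w)
    return dist

g = {'s': [('a', 2), ('b', 6)], 'a': [('b', 3)], 'b': []}
print(dag_shortest_paths(g, 's'))  # {'s': 0, 'a': 2, 'b': 5}
```

Because every edge is relaxed exactly once, negative weights are handled for free, which is why this beats Dijkstra-style approaches on weighted DAGs.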