In formal language theory, regularity is a robust notion with many equivalent representations: regular expressions, finite state automata, monadic second-order (MSO) logic, etc. These have had an important impact on programming and verification tools.
Algorithms working on large data (say of size n) should be both space- and time-efficient. Any algorithm must read the whole input at least once, so the time required is at least n (see sublinear-time algorithms for exceptions).
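As an illustration (not taken from the abstract) of an algorithm that reads its input exactly once yet uses far less space than the input itself: reservoir sampling maintains a uniform random sample of k items from a stream of unknown length in a single pass with O(k) memory.

```python
import random

def reservoir_sample(stream, k):
    """Keep a uniform random sample of k items from a stream of
    unknown length, using one pass and O(k) extra space."""
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            # Fill the reservoir with the first k items.
            sample.append(item)
        else:
            # Keep the new item with probability k/(i+1),
            # evicting a uniformly chosen current sample entry.
            j = random.randrange(i + 1)
            if j < k:
                sample[j] = item
    return sample
```

A short induction shows every item seen so far is in the reservoir with probability exactly k/(i+1) after step i, which is what makes the single pass suffice.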
Petri net theory has nice theorems which relate subclasses of Petri nets defined by structural conditions -- for example T-nets (also known as marked graphs) and free choice nets -- to their behavioural properties -- such as checking, given a net and an initial marking, whether the net is live.
Computer scientists devise randomized algorithms when they cannot find good deterministic ones. They then try to reduce the amount of randomness used while still proving that the algorithm answers correctly with high probability.
In a landmark paper, Papadimitriou introduced several syntactic subclasses of the search class TFNP (Total Function Nondeterministic Polynomial), each based on the style of proof used to establish totality; unlike TFNP itself, these subclasses admit complete problems.
We will discuss a lower bound (due to Noga Alon) on the rank of any real matrix in which all the diagonal entries are significantly larger (in absolute value) than all the other entries.
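To make the shape of such a bound concrete, here is a standard warm-up under the illustrative normalization a_ii = 1 and |a_ij| <= epsilon for i != j (these parameters are an assumption for exposition, not taken from the abstract). It compares the trace of A with its Frobenius norm:

```latex
% (tr A)^2 = (sum of nonzero eigenvalues)^2
%          <= rank(A) * sum_i |lambda_i|^2      (Cauchy--Schwarz)
%          <= rank(A) * \|A\|_F^2               (Schur's inequality)
\operatorname{rank}(A) \;\ge\; \frac{(\operatorname{tr} A)^2}{\|A\|_F^2}
\;\ge\; \frac{n^2}{\,n + n(n-1)\varepsilon^2\,}
\;\ge\; \frac{n}{1 + n\varepsilon^2}.
```

For epsilon <= 1/sqrt(n) this already forces rank at least n/2; Alon's result sharpens the analysis for larger off-diagonal entries, where the trace argument alone becomes weak.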