If networks of chemical reactions are the circuits of biology, then catalysts are the transistors, or perhaps the switches. But which species should be called catalysts? Come to the talk and find out.
Localization is a fundamental problem in robotics. The kidnapped robot possesses a compass and a map of its environment; it must determine its location at minimum cost in travel distance.
In formal language theory, regularity is a robust notion with many equivalent representations: regular expressions, finite state automata, MSO, etc. These have had an important impact on programming and verification tools.
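As a small illustration of this equivalence (my own example, not from the talk): the language of strings over {a, b} ending in "ab" can be described by a regular expression or, equivalently, by a three-state deterministic finite automaton, and the two descriptions agree on every input.

```python
# Two equivalent descriptions of one regular language: a regular
# expression and a hand-built DFA. (Toy example for illustration.)
import itertools
import re

# Language over {a, b}: all strings ending in "ab".
PATTERN = re.compile(r"[ab]*ab")

# Equivalent DFA: state 0 = no progress, 1 = just read 'a',
# 2 = just read "ab" (the only accepting state).
DFA = {
    (0, "a"): 1, (0, "b"): 0,
    (1, "a"): 1, (1, "b"): 2,
    (2, "a"): 1, (2, "b"): 0,
}

def dfa_accepts(word):
    state = 0
    for ch in word:
        state = DFA[(state, ch)]
    return state == 2

# The two representations agree on every string up to length 6.
for n in range(7):
    for word in map("".join, itertools.product("ab", repeat=n)):
        assert dfa_accepts(word) == (PATTERN.fullmatch(word) is not None)
print("regex and DFA agree on all strings up to length 6")
```

The exhaustive check at the end is exactly the kind of cross-validation the equivalence theorems guarantee will always succeed.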
Algorithms working on large data (say of size n) should be both space and time efficient. Any algorithm has to see at least the whole data once, so the time required has to be at least n (see sublinear-time algorithms for exceptions).
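One such exception, sketched below as my own illustration (not from the talk): to estimate the fraction of 1s in a huge bit array to within a small additive error, it suffices to sample a constant number of positions, independent of n, rather than read the whole input.

```python
# Sublinear-time sketch: estimate the fraction of 1s in a length-n
# array by sampling a fixed number of positions instead of reading
# all n entries. (Illustrative example, with a hypothetical sample
# budget of 1000.)
import random

def estimate_ones(data, samples=1000):
    # Reads only `samples` entries, regardless of len(data).
    n = len(data)
    hits = sum(data[random.randrange(n)] for _ in range(samples))
    return hits / samples

random.seed(0)
data = [1] * 300_000 + [0] * 700_000  # true fraction of 1s: 0.3
est = estimate_ones(data)
# A Chernoff bound puts the estimate within ~0.05 of 0.3
# with high probability, after reading only 1000 of the
# 1,000,000 entries.
assert abs(est - 0.3) < 0.05
```

The trade-off is typical of sublinear algorithms: an exact answer still needs n reads, but an approximate answer with a probabilistic guarantee does not.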
Petri net theory has elegant theorems relating subclasses of Petri nets defined by structural conditions -- for example, T-nets (also known as marked graphs) and free choice nets -- to their behavioural properties, such as checking, given a net
Computer scientists devise randomized algorithms when they cannot find good deterministic ones. They then try to reduce the amount of randomness used, while still proving that the algorithm answers correctly with high probability.
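A classic example of such a guarantee (my own illustration, not from the talk) is Freivalds' algorithm: it verifies a claimed matrix product A·B = C in quadratic time per trial by testing A(Bx) = Cx for a random 0/1 vector x. Each trial errs with probability at most 1/2, so repeating k independent trials drives the error below 2^-k.

```python
# Freivalds' algorithm: randomized verification of a matrix product.
# Each trial costs O(n^2) (three matrix-vector products) instead of
# the O(n^3) of recomputing A @ B; k trials err with prob. <= 2^-k.
import random

def matvec(M, x):
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

def freivalds(A, B, C, trials=20):
    n = len(A)
    for _ in range(trials):
        x = [random.randint(0, 1) for _ in range(n)]
        if matvec(A, matvec(B, x)) != matvec(C, x):
            return False  # witness found: certainly A @ B != C
    return True  # A @ B == C with probability >= 1 - 2**-trials

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C_good = [[19, 22], [43, 50]]  # the true product A @ B
C_bad = [[19, 22], [43, 51]]   # off by one in one entry
assert freivalds(A, B, C_good)
assert not freivalds(A, B, C_bad)
```

The final pair of assertions shows both behaviours: a correct product is never rejected, and a wrong one is caught except with negligible probability (here about one in a million over 20 trials).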
In a landmark paper, Papadimitriou introduced several syntactic subclasses of the search class TFNP (Total Function Nondeterministic Polynomial) based on proof styles that (unlike TFNP) admit complete problems.