# DTIME

In computational complexity theory, **DTIME** (or **TIME**) is the computational resource of computation time for a deterministic Turing machine. It represents the amount of time (or number of computation steps) that a "normal" physical computer would take to solve a certain computational problem using a certain algorithm. It is one of the most well-studied complexity resources, because it corresponds so closely to an important real-world resource (the amount of time it takes a computer to solve a problem).

The resource **DTIME** is used to define complexity classes, sets of all of the decision problems which can be solved using a certain amount of computation time. If a problem of input size *n* can be solved in time *O*(*f*(*n*)), we have a complexity class **DTIME**(*f*(*n*)) (or **TIME**(*f*(*n*))). There is no restriction on the amount of memory space used, but there may be restrictions on some other complexity resources (like alternation).
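As an informal illustration (the function below is a hypothetical example, not part of any formal machine model), one can count the elementary steps a deterministic procedure performs and relate that count to the input size *n*:

```python
def is_palindrome(s):
    """Decide whether s is a palindrome, counting comparison steps.

    The step count is at most n/2, so under this informal cost model
    the problem is decidable in linear deterministic time, i.e. it
    lies in DTIME(n) (constant factors do not matter; see below).
    """
    steps = 0
    i, j = 0, len(s) - 1
    while i < j:
        steps += 1           # one comparison = one "step"
        if s[i] != s[j]:
            return False, steps
        i += 1
        j -= 1
    return True, steps
```

Doubling the input length roughly doubles the step count, which is the kind of resource bound **DTIME** formalizes.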

## Complexity classes in DTIME

Many important complexity classes are defined in terms of **DTIME**, containing all of the problems that can be solved in a certain amount of deterministic time. Any proper complexity function can be used to define a complexity class, but only certain classes are useful to study. In general, we desire our complexity classes to be robust against changes in the computational model, and to be closed under composition of subroutines.

DTIME satisfies the time hierarchy theorem, meaning that asymptotically larger amounts of time always create strictly larger sets of problems.
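One common formulation of the deterministic time hierarchy theorem (stated here as a sketch; the exact constructibility hypotheses vary between textbooks) is:

```latex
% If f is time-constructible and f(n) \log f(n) = o(g(n)), then
\mathsf{DTIME}\big(f(n)\big) \subsetneq \mathsf{DTIME}\big(g(n)\big)
% For example, DTIME(n) is strictly contained in DTIME(n^2).
```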

The well-known complexity class **P** comprises all of the problems which can be solved in a polynomial amount of **DTIME**. It can be defined formally as:

$$\mathsf{P} = \bigcup_{k \in \mathbb{N}} \mathsf{DTIME}\!\left(n^k\right)$$

**P** is the smallest robust class which includes linear-time problems (AMS 2004, Lecture 2.2, p. 20). **P** is one of the largest complexity classes considered "computationally feasible".

A much larger class using deterministic time is EXPTIME, which contains all of the problems solvable using a deterministic machine in exponential time. Formally, we have

$$\mathsf{EXPTIME} = \bigcup_{k \in \mathbb{N}} \mathsf{DTIME}\!\left(2^{n^k}\right)$$

Larger complexity classes can be defined similarly. Because of the time hierarchy theorem, these classes form a strict hierarchy; we know that $\mathsf{P} \subsetneq \mathsf{EXPTIME}$, and on up.

## Machine model

The exact machine model used to define DTIME can vary without affecting the power of the resource. Results in the literature often use multitape Turing machines, particularly when discussing very small time classes. In particular, a multitape deterministic Turing machine can never provide more than a quadratic time speedup over a single-tape machine.^{[1]}

Multiplicative constants in the amount of time used do not change the power of DTIME classes; a constant multiplicative speedup can always be obtained by increasing the number of states in the finite state control. In the statement of Papadimitriou,^{[2]} for a language L,

- Let $L \in \mathsf{DTIME}(f(n))$. Then, for any $\epsilon > 0$, $L \in \mathsf{DTIME}(f'(n))$, where $f'(n) = \epsilon f(n) + n + 2$.
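For instance, choosing the illustrative values $f(n) = n^2$ and $\epsilon = 1/10$, the linear speedup theorem gives:

```latex
% Worked instance of linear speedup (illustrative numbers):
L \in \mathsf{DTIME}(n^2)
\;\Longrightarrow\;
L \in \mathsf{DTIME}\!\left(\tfrac{1}{10}\,n^2 + n + 2\right)
% The constant-factor gain is obtained by enlarging the tape alphabet
% and state set so the machine processes several tape cells per step.
```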

## Generalizations

Using a model other than a deterministic Turing machine, there are various generalizations and restrictions of DTIME. For example, if we use a nondeterministic Turing machine, we have the resource NTIME. The relationship between the expressive powers of DTIME and other computational resources is very poorly understood. One of the few known results^{[3]} is

$$\mathsf{DTIME}(O(n)) \subsetneq \mathsf{NTIME}(O(n))$$

for multitape machines. This was extended to

$$\mathsf{DTIME}\!\left(O\!\left(n\sqrt{\log^* n}\right)\right) \subsetneq \mathsf{NTIME}\!\left(O\!\left(n\sqrt{\log^* n}\right)\right)$$

by Santhanam.^{[4]}

If we use an alternating Turing machine, we have the resource ATIME.

## References

1. **^** Papadimitriou 1994, Thrm. 2.1.
2. **^** Papadimitriou 1994, Thrm. 2.2.
3. **^** Wolfgang Paul, Nick Pippenger, Endre Szemerédi, William Trotter. "On determinism versus non-determinism and related problems". *24th Annual Symposium on Foundations of Computer Science*, 1983. doi:10.1109/SFCS.1983.39.
4. **^** Rahul Santhanam. "On separators, segregators and time versus space". *16th Annual IEEE Conference on Computational Complexity*, 2001.

- American Mathematical Society (2004). Rudich, Steven; Wigderson, Avi, eds. *Computational Complexity Theory*. American Mathematical Society and Institute for Advanced Study. ISBN 0-8218-2872-X.
- Papadimitriou, Christos H. (1994). *Computational Complexity*. Reading, Massachusetts: Addison-Wesley. ISBN 0-201-53082-1.