I am learning computational complexity, and this is a question from my assignment that I am having trouble solving/understanding.
An oracle Turing Machine M with oracle A is a Turing Machine with an additional query tape and three special states, say Qquery, Qyes and Qno. Whenever M enters the state Qquery, in one step it moves into state Qyes or Qno, depending on whether or not the string on the query tape is in the set A; the query tape is then erased.
A set A is self-reducible if there is a deterministic polynomial time oracle machine M such that the following holds:
- A = L(M;A), where L(M;A) is the language decided by machine M with an oracle for the set A, and
- on inputs of length n, machine M queries the oracle only about strings of length at most n-1.
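To get a feel for the definition before tackling the graph version, here is a toy Python sketch of the classic self-reduction for SAT: on a formula, the machine fixes one variable and queries the oracle on the two simplified formulas, which are strictly smaller (hence shorter under any reasonable encoding). The names `brute_force_sat`, `simplify`, and `self_reducing_machine` are my own, chosen for illustration; the oracle is simulated by exhaustive search.

```python
from itertools import product

def brute_force_sat(clauses):
    """Stand-in for the oracle: decide satisfiability by exhaustive search.
    A formula is a list of clauses; a clause is a list of nonzero ints,
    where k means variable k and -k means its negation (DIMACS-style)."""
    vars_ = sorted({abs(lit) for clause in clauses for lit in clause})
    for bits in product([False, True], repeat=len(vars_)):
        assignment = dict(zip(vars_, bits))
        if all(any(assignment[abs(lit)] == (lit > 0) for lit in clause)
               for clause in clauses):
            return True
    return False

def simplify(clauses, var, value):
    """Fix `var` to `value`: drop satisfied clauses, shrink the rest."""
    satisfied_literal = var if value else -var
    result = []
    for clause in clauses:
        if satisfied_literal in clause:
            continue  # clause already satisfied, remove it
        result.append([lit for lit in clause if abs(lit) != var])
    return result

def self_reducing_machine(clauses, oracle):
    """Decide SAT while querying the oracle only on strictly smaller formulas,
    mirroring the length-(n-1) restriction in the definition above."""
    if not clauses:
        return True   # empty formula is trivially satisfiable
    if any(len(clause) == 0 for clause in clauses):
        return False  # an empty clause can never be satisfied
    v = abs(clauses[0][0])  # branch on the first variable that appears
    return (oracle(simplify(clauses, v, True))
            or oracle(simplify(clauses, v, False)))
```

Note that the machine itself does only polynomial work per input; all the hard deciding is pushed into the (shorter) oracle queries, which is exactly the spirit of the definition.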
Prove that the set of all Hamiltonian graphs is self-reducible.