
Algorithm for effective decidability
Posted:
Sep 29, 2017 4:47 PM


Assume 's' is in a set of numbers S, and consider the algorithm which, for input 'n', effectively tests whether 'n' is in S. If it is, the algorithm gets a 'yes' and outputs 'n'; otherwise it outputs 's'. My problem in understanding this is that, obviously, if 'n' is not in S, then the algorithm will churn forever, so it's not clear under what circumstances it would ever output 's'. Does every such algorithm contain an internal limit on how long it can run, as actual computation would require? Or, in a theoretical treatment, can it indeed run forever?
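To make the question concrete, here is a minimal sketch of the construction with an explicit step budget added. Everything here is illustrative: S is taken to be the even numbers, the default element s = 2, and `max_steps` stands in for the bound whose existence the question is about.

```python
def is_in_S(n, max_steps):
    """Toy stand-in for a semi-decision procedure for S (here: the evens).

    A genuine semi-decision procedure may run forever on non-members;
    imposing a step budget is one way to make the overall test total."""
    steps_needed = 1  # pretend the test for an even n halts in 1 step
    return n % 2 == 0 and steps_needed <= max_steps

def f(n, s=2, max_steps=100):
    """Output n if the bounded test says 'yes'; otherwise fall back to s."""
    return n if is_in_S(n, max_steps) else s

print(f(4))  # 4 is in S, so the algorithm outputs 4
print(f(3))  # the bounded test gives up, so it outputs the default s = 2
```

Without the `max_steps` cutoff, the `else` branch could never be reached for a non-member on which the test diverges; with it, `f` is total at the cost of only ever confirming membership within the budget.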
All clarifications appreciated, am

