Talk:Proof

"because the user does not understand what is happening".
::::::: As for the last sentence, that cannot be a problem in and of itself.
::::::: More specifically, when dealing with a concept like "Natural Numbers", a computer cannot represent them in their entirety. Instead, we have to accept some symbol which represents them. And thus, when we deal with a "proof about Natural Numbers", we are really dealing not with natural numbers but with a surrogate (see the sketch below). The surrogate can have only a subset of all the properties of natural numbers. A question, then, in the context of a proof, is: does the mechanical implementation represent all the relevant properties of natural numbers in the surrogate? In the context of "even" and "odd", all of the relevant properties are present in single-digit decimal numbers and their sums (and by this I mean that you are not going to be able to demonstrate any relevant exceptions without going outside the bounds of this task).
::::::: Thus, your point seems to be that you want the task to be expressed in a system which encodes the relevant rules in a particular system of symbols, and you want the task to demonstrate that the language performs the "proper sorts of manipulation" of those symbols. But that has problems: (a) the task does not really define what is "proper", and (b) as near as I can tell, a person would have to study the language implementation to determine whether [for example] induction with excluded middle is handled adequately. --[[User:Rdm|Rdm]]
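::::::: To make the "surrogate" and "proper manipulation" points concrete, here is a minimal sketch, assuming a Lean-style proof assistant; the names Nat', add, Even and even_add_even are illustrative, not part of the task or of any particular solution:
<pre>
-- A hypothetical surrogate for the natural numbers: a Peano-style
-- inductive type, not the natural numbers themselves.
inductive Nat' where
  | zero : Nat'
  | succ : Nat' → Nat'

-- Addition, defined by recursion on the second argument.
def add : Nat' → Nat' → Nat'
  | m, Nat'.zero   => m
  | m, Nat'.succ n => Nat'.succ (add m n)

-- Evenness as an inductive predicate: the surrogate carries only the
-- properties we choose to state, here the two rules defining "even".
inductive Even : Nat' → Prop where
  | zero : Even Nat'.zero
  | step : ∀ {n}, Even n → Even (Nat'.succ (Nat'.succ n))

-- One reading of "proper manipulation of symbols": a proof by
-- structural induction that the sum of two even surrogates is even.
theorem even_add_even {m n : Nat'} (hm : Even m) (hn : Even n) :
    Even (add m n) := by
  induction hn with
  | zero      => simpa [add] using hm
  | step h ih => simpa [add] using Even.step ih
</pre>
::::::: The point of the sketch is only that the proof is about the surrogate Nat' and the rules stated for it, not about the natural numbers themselves.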