Yeah, I understand. I think it would be too much work to add the comments to 
the actual theorems one by one; I was thinking more that it would be a 
post-processing step to help explain to the model, in natural language, how 
the steps are related. As you say, though, maybe having it read the Metamath 
book or something might be a better way of explaining.

I thought this was an interesting release from Google: it looks like they're 
offering fine-tuning of their new models, with free credits to get started. 
I don't know whether the credits would stretch to training on the MM 
database, as I haven't explored it. I also don't know whether the MM 
database is already in its training set.

https://cloud.google.com/vertex-ai

---------------------

Here's a little conversation snippet with Bard

*Me: do you know what the metamath database is?*

Yes, I know what the Metamath database is. It is a database of mathematical 
proofs that have been verified by a computer program. The database is 
written in a formal language called Metamath, which is designed to be 
simple and easy to understand. The Metamath database contains proofs of 
over 23,000 theorems, including many important theorems from logic, set 
theory, number theory, algebra, topology, and analysis.

The Metamath database is a valuable resource for mathematicians and 
students of mathematics. It allows them to see how mathematical theorems 
are proven, and it can help them to understand the underlying logic of 
mathematics. The database can also be used to find proofs of specific 
theorems, and it can be used to generate new proofs.

The Metamath database is a free and open-source project. You can download 
the database and the Metamath program from the Metamath website.

*Me: Is the metamath database part of your training data?*

No, the Metamath database is not part of my training data. I was trained on 
a massive dataset of text and code, but the Metamath database is not 
included in that dataset. This means that I am not able to answer questions 
about mathematical theorems or proofs.

However, I can still answer general questions about mathematics. For 
example, I can tell you what a theorem is, or I can explain the difference 
between a proof and a disproof. I can also generate mathematical 
expressions and solve simple math problems.

If you have a question about a specific mathematical theorem or proof, I 
recommend that you consult a mathematician or a mathematics textbook.

*Me: Can you generate me a metamath proof that |- ( 5 + 2 ) = 7*

Sure. Here is a Metamath proof that |- ( 5 + 2 ) = 7:

|- 0 = 0. [Axiom]
|- (x = 0) → (x + y = y). [Axiom]
|- (x = y) → (y = x). [Axiom]
|- (x = y) → (x + z = y + z). [Axiom]
|- (x = y) → (z + x = z + y). [Axiom]
|- (x = y) → (x * z = y * z). [Axiom]
|- (x = y) → (z * x = z * y). [Axiom]
|- (x < y) → (x + z < y + z). [Axiom]
|- (x < y) → (z + x < z + y). [Axiom]
|- (x < y) → (x * z < y * z). [Axiom]
|- (x < y) → (z * x < z * y). [Axiom]
|- (x > y) → (x + z > y + z). [Axiom]
|- (x > y) → (z + x > z + y). [Axiom]
|- (x > y) → (x * z > y * z). [Axiom]
|- (x > y) → (z * x > z * y). [Axiom]
|- (x <= y) → (x = y ∨ x < y). [Axiom]
|- (x >= y) → (x = y ∨ x > y). [Axiom]
|- (x = y) → (x <= y ∧ x >= y). [Axiom]
|- (x <= y ∧ y <= x) → (x = y). [Axiom]
|- (x < y ∨ y < x) → (x ≠ y). [Axiom]
|- Successor(0) = 1. [Axiom]
|- Successor(x) = Successor(Successor(y)) → x = y. [Axiom]
|- (x = 0) → Successor(x) = 1. [Axiom]
|- (x ≠ 0) → Successor(x) = Successor(Successor(y)) → x = y. [Axiom]
|- (x = 0) → Successor(Successor(x)) = 2. [Axiom]
|- (x ≠ 0) → Successor(Successor(x)) = Successor(Successor(y)) → x = y. 
[Axiom]
|- (x = 0) → Successor(Successor(Successor(x))) = 3. [Axiom]
|- (x ≠ 0) → Successor(Successor(Successor(x))) = Successor(Successor(y)) → 
x = y. [Axiom]
|- (x = 0) → Successor(Successor(Successor(Successor(x)))) = 4. [Axiom]
|- (x ≠ 0) → Successor(Successor(Successor(Successor(x)))) = 
Successor(Successor(y)) → x = y. [Axiom]
|- (x = 0) → Successor(Successor(Successor(Successor(Successor(x))))) = 5. 
[Axiom]
|- (x ≠ 0) → Successor(Successor(Successor(Successor(Successor(x))))) = 
Successor(Successor(y)) → x = y. [Axiom]
|- (x = 0) → 
Successor(Successor(Successor(Successor(Successor(Successor(x)))))) = 6. 
[Axiom]
|- (x ≠ 0) → 
Successor(Successor(Successor(Successor(Successor(Successor(x)))))) = 
Successor(Successor(y)) → x = y. [Axiom]
|- (x = 0) → 
Successor(Successor(Successor(Successor(Successor(Successor(Successor(x

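(For what it's worth, Bard's output above isn't valid Metamath syntax at all. A real Metamath proof is just a whitespace-separated list of statement labels. Here is a sketch based on th1 from demo0.mm, the sample database in the Metamath book — reproduced from memory, so check the book for the exact text — which also shows the internal-comment syntax David mentions below:)

```metamath
$( th1 from demo0.mm, the Metamath book's sample database.
   Quoted from memory; details may differ from the published text. $)
th1 $p |- t = t $=
  $( Comments like this one are legal between the labels of an
     uncompressed proof, which is the internal-comment support
     described in Appendix E of the Metamath book. $)
  tt tze tpl tt weq tt tt weq tt a2
  tt tze tpl tt weq tt tze tpl tt weq tt tt weq wim tt a2
  tt tze tpl tt tt a1 mp mp $.
```

(A proof of |- ( 5 + 2 ) = 7 in set.mm would look similar in form: a $p statement whose $= body lists the labels of previously proved statements, verified step by step against the axioms — nothing like the pseudo-axiom list Bard produced.)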
On Monday, May 8, 2023 at 6:59:00 PM UTC+1 David A. Wheeler wrote:

>
>
> > On May 7, 2023, at 12:23 PM, Jon P <[email protected]> wrote:
> > Re this one thing I wondered about is whether adding english text 
> between the statements might help a language model understand what is 
> happening? It would be very repetitive so maybe it makes no sense but here 
> is an example. ...
>
> The underlying Metamath proof format *does* support comments internal to 
> the proof.
> See the Metamath book appendix E ("Metamath Language EBNF").
>
> However, in *practice* tools don't use that information to generate 
> output, nor do they
> take any steps to keep such comments. I'm not sure anyone is seriously 
> interested in processing them.
> Maybe we should, but then we'd have to think about how to handle changes.
>
> > I think maybe that makes it easier to understand what is happening and 
> gives the language model something to grip on to while training?
>
> Perhaps, but no one's trying to do that. Currently, if there's a 
> trickiness, it's documented in the high-level description of the statement 
> as a whole. If it's more sophisticated, it might be better split out as its 
> own theorem.
>
> --- David A. Wheeler
>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Metamath" group.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/metamath/960b892c-a7ad-4332-90d7-a7d3f15ff803n%40googlegroups.com.