Mainly a discussion post. **A TL;DR is provided at the end of each subsection of text, but please read through properly before commenting.**

**Breakdown of classic definition of 'googology':** We all (well, most of us) know what googology *is*, as a hobby. To quote the main page: "*Googology is the mathematical study of large numbers, their properties, and their nomenclature.*"

However, I don't think this is accurate, from either a hobbyist's or mathematician's perspective. In particular, large numbers don't have any natural noteworthy properties (known or theorised, afaik), aside from being above certain boundaries, and these are mostly pre-tetration.

A natural nomenclature (a system of naming, ideally scientific) is perhaps closer to hand, but it would necessarily be (almost) wholly systematic: not an assorted spree of individual numbers, but a rich network of many suffixes (which, unfortunately, is rarely pursued, being ignored in favour of the individual numbers that get defined). Ideally, such a system would require justification, but that's not something a hobbyist (or, arguably, a mathematician) should have to waste their time on.

**TL;DR: The definition of googology is closer to "the systematic creation of large numbers" than the longer definition on the main page.**

**Semantics vs syntax start:** So, what *is* googology, exactly? If we consider which fast-growing functions are most widely accepted by googologists as googological, or as worthy of praise, that might give us a better idea. The functions most often used are typically (but not always) computable, and they typically build on notations that admit a vast array of different structures and entries, rather than relying on some mathematical trick.

This gives us a good idea of what most of googology is about: in googology, we study "syntactic functions", not "semantic functions". To explain these terms (which are informal and coined on the spot for this post), we need to understand the meanings of syntax and semantics. *Syntax* is the construction of statements in some language. *Semantics* is the meaning of a statement in a language.
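To make "syntactic" concrete, here is a minimal sketch of my own (not from any standard source): Knuth's up-arrow \(a \uparrow^k b\), a standard googological function, evaluated purely by recursion on the shape of the expression, with no appeal to truth values or halting behaviour.

```python
def up(a, k, b):
    """Evaluate a (up-arrow)^k b by structural recursion on (k, b) alone.

    The rules mirror rewriting the expression's syntax:
      a ^1 b        = a ** b
      a ^k 1        = a
      a ^k b        = a ^(k-1) (a ^k (b-1))
    """
    if k == 1:
        return a ** b
    if b == 1:
        return a
    return up(a, k - 1, up(a, k, b - 1))

# up(2, 2, 3) = 2↑↑3 = 2^(2^2) = 16
```

Nothing in the evaluation asks whether any statement is *true*; it only pattern-matches on how the expression was built, which is the sense of "syntactic" used below.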

**Grammars introduced:** To explain the point a bit better, I'll introduce the idea of a formal grammar. A formal grammar (hereafter called a grammar) is a system of rules for making strings. An example is the set of rules for creating a formula in the language of set theory: \(s \to f; \quad f \to t \in t \mid (f \to f) \mid (f \land f) \mid (f \lor f) \mid \lnot f \mid \forall v(f) \mid \exists v(f); \quad t \to v; \quad v \to X \mid v'\). Here, \(f\) are the formulae, \(t\) are the terms (sets), \(v\) are the variables (which here are \(X\), \(X'\), \(X''\), ...), and \(s\) is the starting string.

To create a string with the grammar, all we need to do is start with the string \(s\) and then repeatedly apply any of the replacement rules. As an example, we can construct the formula \(\forall X(X \in X)\) as \(s \to f \to \forall v(f) \to \forall X(f) \to \forall X(t \in t) \to \forall X(v \in t) \to \cdots \to \forall X(X \in X)\). Now, syntax says that \(\forall X(X \in X)\) is a valid string; it also governs conventions such as forbidding the rebinding of variables, under which \(\exists X(\forall X(X \in X))\), although constructible from the rules above, is rejected as a string of the language. Semantics says what \(\forall X(X \in X)\) means, and that it is, in fact, false.
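The derivation mechanism itself is tiny. Here is a sketch of my own (not from the post): nonterminals are the single characters `s`, `f`, `t`, `v`; a step replaces the leftmost occurrence of a nonterminal with one of its productions.

```python
# Productions of the grammar above; everything not a key here is a terminal.
RULES = {
    "s": ["f"],
    "f": ["t∈t", "(f→f)", "(f∧f)", "(f∨f)", "¬f", "∀v(f)", "∃v(f)"],
    "t": ["v"],
    "v": ["X", "v'"],
}

def step(string, nonterminal, production):
    """Apply one rule to the leftmost occurrence of the nonterminal."""
    assert production in RULES[nonterminal]
    return string.replace(nonterminal, production, 1)

# Rebuild the derivation of ∀X(X∈X) from the post, one rule at a time:
w = "s"
for nt, prod in [("s", "f"), ("f", "∀v(f)"), ("v", "X"), ("f", "t∈t"),
                 ("t", "v"), ("v", "X"), ("t", "v"), ("v", "X")]:
    w = step(w, nt, prod)
# w is now "∀X(X∈X)"
```

Note that checking whether a string is derivable is a purely mechanical question about rule applications, never about what the formula means.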

(The above grammar is a context-free grammar, as the string on the left of each arrow is always a single character, and context-free grammars are in general quite weak. Unrestricted (type-0) grammars can simulate any Turing machine, and so can define a far wider range of languages.)

**Googology normally syntax:** Therefore, we could say that a syntactic function is a function whose evaluation is based on the syntax of an expression, and a semantic function is a function whose evaluation is based on the truth value of an expression.

Under these definitions, googology primarily concerns itself with syntactic functions that generate numbers, not semantic functions doing the same job. It should be noted that all syntactic functions defined so far (as far as I know*) are provably recursive in second-order arithmetic. All uncomputable functions are semantic, as they at the least require deciding the halting problem; and all computable functions beyond this mark either require diagonalising over proofs that an expression really is a number (unique, recursively determined, and actually a number) in some theory, or use some semantic property such as 'the least x such that...' or 'the number of moves it takes this game to end'.

*This may be incorrect, but refuting it would require a correctly-translated version of Taranovsky's ordinal notation system, together with the corresponding (and, I believe, unsupported) claim that it reaches the strength of second-order arithmetic, so I am fairly confident here. Mainly because, if his work held water, it would likely have been referenced by Rathjen or someone else in the field by now.

**Technicalities of a potential exception:** One could argue on technicalities that Loader's number is an exception, since it is based on a (purely syntactic) type theory, and determining that an expression is a (unique, recursively determined) number is as simple as checking that the typing making up the number is valid. However, bijecting the theory-naturals with the actual naturals is still required, as is the subsequent summing (or, in other cases, comparison) of actual naturals (which 'should' be done pre-bijection to count as a syntactic property), so this is at best a grey area.

**Summary and intensionality:** This, with a formal enough definition of syntax, may be enough to formally distinguish the 'syntactic' googological functions that we all know, love, and work with the most from other functions. In particular, googological functions should in general be considered intensionally, not extensionally, as (presumably) all recursive functions have some syntactic definition. Even in practice we wouldn't consider \(n\uparrow\uparrow n\) and \(g_{\varepsilon_0}(n)\) to be the same function, even if they happened to produce the same output for every input.
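The intensional/extensional distinction is easy to see in miniature. A toy illustration of my own: two definitions that agree on every input (extensionally equal) while remaining different pieces of syntax (intensionally distinct).

```python
def double_by_addition(n):
    return n + n          # defined via addition

def double_by_scaling(n):
    return 2 * n          # defined via multiplication

# Extensionally these agree on every input we can check...
assert all(double_by_addition(n) == double_by_scaling(n) for n in range(100))
# ...yet as definitions they are distinct objects, which is the sense in
# which googology cares about *how* a number is built, not just its value.
```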

**More exceptions:** It should be noted, however, that there are still multiple exceptions to this, and that this definition could easily be regarded as too strict. For instance, it implies that all googologically defined numbers are integers, and that SSCG (and possibly TREE) is not syntactically googological.

**TL;DR: Most googological functions can and should be interpreted in a purely syntactic manner, and should be viewed intensionally, not extensionally.**

**Creating a general system for googology**: So, okay, that's a rough (and hopefully correct and non-super-controversial) classification of the majority of googological functions. How exactly do we formalise this?

It is impossible to create a recursive system that creates exactly the computable functions, as the total computable functions are not recursively enumerable. Worse, any such system can only create functions of bounded growth rate, as we could diagonalise over the functions the system creates to obtain a yet faster-growing recursive function.
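The diagonalisation step can be sketched directly. This is my own illustration, with a stand-in enumeration: given any recursive enumeration \(f_0, f_1, \ldots\) of the functions a system creates, the diagonal function outgrows every one of them.

```python
def enumerate_functions(i):
    """Stand-in for 'the i-th function the hypothetical system creates'.
    Here just i-fold iterated doubling-plus-one, purely for illustration."""
    def f(n):
        for _ in range(i):
            n = 2 * n + 1
        return n
    return f

def diagonal(n):
    """Strictly exceeds f_n at n, so it equals no f_i anywhere past i."""
    return enumerate_functions(n)(n) + 1
```

Since `diagonal` is itself recursive whenever the enumeration is, the system could not have created it, which is exactly why any such system has a growth-rate ceiling.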

Furthermore, it seems unlikely that there is a presentation of arbitrary recursive functions (one that doesn't rely on supplying recursive well-orders directly) that has no constructive translation from a general recursive presentation (i.e. a presentation representing all recursive functions is equivalent to LoEM over an intuitionistic theory).

**Where the type theory starts:** This, however, seems doable in a system of bounded strength. For instance, it is completely unclear how one would create a function, with a typing derivation of type \((\forall \alpha.(\alpha \to \alpha) \to \alpha \to \alpha) \to \forall \alpha.(\alpha \to \alpha) \to \alpha \to \alpha\)*, in \(\lambda 2\) that outgrows all functions provably recursive in \(\Pi^1_7\text{-}\mathbf{CA}\); however, this is relatively easy in \(\lambda\mu\) (and may not be too hard in \(\lambda\omega\)).

*Functions from Church numerals (the typical representation of the naturals, here polymorphically typed) to Church numerals.
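For readers who haven't met Church numerals: here is a hedged sketch in untyped Python (standing in for \(\lambda 2\)'s polymorphic terms, which Python cannot type-check). The numeral \(n\) is the functional that applies its first argument \(n\) times, and `exp2` below is a numeral-to-numeral function of the kind the type above describes.

```python
zero = lambda s: lambda z: z                      # applies s zero times
succ = lambda n: lambda s: lambda z: s(n(s)(z))   # one more application of s

def church(k):
    """Helper: build the Church numeral for the ordinary integer k."""
    n = zero
    for _ in range(k):
        n = succ(n)
    return n

def to_int(n):
    """Decode a Church numeral by applying it to (+1) starting from 0."""
    return n(lambda x: x + 1)(0)

# A numeral -> numeral function, typable in System F: n |-> 2^n,
# by iterating 'doubling' (m |-> m composed with s twice) n times from 1.
exp2 = lambda n: n(lambda m: lambda s: m(lambda z: s(s(z))))(succ(zero))
```

The point of the post's footnote is that which such functions are *typable* depends heavily on the ambient type system, even though the untyped terms look the same.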

Personally, it feels like the natural setting for such a theory is Curry-style type theory (the introduction I used is here; the Curry systems are the ones relevant to this post) or, more specifically, a theory of type assignment with unit, sum, product, recursive and first-order function types, where all objects that aren't functions are built from unit, sum and product. Hence all such objects are isomorphic to a unique binary tree, by letting units be leaves, products be nodes with two children, and direct sums be nodes with either a left child or a right child.

I think this because the type constructors can be used to create analogues of inductive types, which lets us create types for fairly complex families of structures. Furthermore, the use of binary trees in this way allows simple comparison of complex objects, which is very helpful for extensions that add abstractors with nontrivial behaviour while preserving strong normalisation (every reduction sequence terminates, which corresponds to totality) and confluence (each term has at most one normal form), both arguably necessary properties for what we want.
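The tree encoding in the previous paragraphs can be made concrete. A sketch of my own encoding, using tagged tuples: unit is a leaf, a product is a node with two children, and an injection into a sum is a node with only a left or only a right child, so structural equality on tuples is exactly comparison of the encoded objects.

```python
Unit = ("unit",)                       # leaf
def Pair(a, b): return ("pair", a, b)  # product: node with two children
def Inl(a):     return ("inl", a)      # sum: node with only a left child
def Inr(b):     return ("inr", b)      # sum: node with only a right child

# Booleans as the sum type 1 + 1:
true, false = Inl(Unit), Inr(Unit)

# Naturals via the recursive sum type N = 1 + N (zero | successor):
def nat(k):
    return Inl(Unit) if k == 0 else Inr(nat(k - 1))

# Comparison of arbitrary such objects is plain structural equality:
assert nat(3) == Inr(Inr(Inr(Inl(Unit))))
assert Pair(true, nat(1)) != Pair(false, nat(1))
```

This is the "unique binary tree" isomorphism in action: two objects of these types are equal exactly when their trees match node for node.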

However, I guess most people here are unfamiliar with type theory, so if I do axiomatise something like that (which I am sorely tempted to do), I'll leave it for another post.

**One last thing:** Type theory, I think, is greatly underrepresented in googological settings. A type theory is often a natural language which can (often only) express recursive functions that are always total, yet can do so for functions of great strength. There are even type theories where the representable functions \(\omega\to\omega\) are precisely those provably recursive in ZFC + 'there are at least n inaccessible cardinals' (for some n); such a type theory underlies the proof assistant Coq.

There are much simpler examples, however. In this paper, §5.3, Gödel's \(\mathcal T\), the type theory that axiomatises primitive recursion on higher-order functions, is considered and found to have strength \(\varepsilon_0\). §5.4 uses bar recursion, and is much stronger. Reading the first of these, if you have a basic idea of (simply typed) lambda calculus (hint: see Wikipedia), will probably be well worth your time.
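To show what "primitive recursion on higher-order functions" buys you, here is a hedged sketch (mine, in untyped Python rather than \(\mathcal T\) itself) of the recursor \(R\), first at base type and then at function type, where iterating *functions* yields Ackermann-like growth that first-order primitive recursion cannot reach.

```python
def R(base, step, n):
    """Goedel's T recursor: R(b, s, 0) = b;  R(b, s, n+1) = s(n, R(b, s, n)).
    In T, base and step may themselves have higher type."""
    acc = base
    for i in range(n):
        acc = step(i, acc)
    return acc

# First-order use: addition, as in ordinary primitive recursion.
add = lambda m, n: R(m, lambda _, acc: acc + 1, n)

# Higher-order use: recursion whose accumulator is a *function*.
# ack(m) is the m-th level of an Ackermann-style hierarchy:
#   ack(0)(n) = n + 1,   ack(m+1)(n) = ack(m) iterated n+1 times on 1.
def ack(m):
    return R(lambda n: n + 1,
             lambda _, f: lambda n: R(f(1), lambda __, acc: f(acc), n),
             m)
```

Each fixed level `ack(m)` is primitive recursive, but the map \(m \mapsto\) `ack(m)` needs the recursor at function type, which is exactly where \(\mathcal T\) passes ordinary primitive recursion on its way to strength \(\varepsilon_0\).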

**TL;DR: Type theory is really underrated by googologists and you should probably go check it out.**

Umm, think that's it. Byee?

**Edit:** Okay, that was waay too rambly. Oh well. Also, you don't have to agree with me :).