Forcing is a model-theoretic technique used to prove that adding certain axioms to classical set theory does not produce (new) inconsistencies in the resulting theory, or that certain statements are neither provable nor disprovable from the axioms of set theory.

It was famously invented and used by Paul Cohen in 1963 for his proof of the independence of the axiom of choice and the continuum hypothesis from the axioms of Zermelo-Fraenkel set theory.

By the completeness theorem for first-order logic (of which set theory is an instance), one can think of forcing as a procedure for transforming a deduction in one formal system into a deduction in another formal system. My doctoral thesis supervisor and I are interested in finding out what the computational content of such a transformation is.

In the book "Foundations of Constructive Mathematics", Michael Beeson devotes a chapter to forcing, translating the technique to a constructive setting where no notion of a model is involved, thereby allowing one to read his exposition as a procedure for transforming proofs.

His interest is in obtaining conservative extension results. More precisely, he gives a general methodology for showing a theory T to be a conservative extension of a theory S. He uses this methodology to give a new (we are talking about 1979 here) and conceptually clearer proof of Goodman's theorem, which says that HA^ω+AC is a conservative extension of HA^ω (which is in turn conservative over HA). Here HA is Heyting arithmetic, HA^ω is higher-order Heyting arithmetic, and AC is the axiom of choice.

The method is to show that the following chain of interpretations is sound: HA^ω ---> HA^ω+AC ---> HA^ω. The first arrow is a "realisability interpretation" of HA^ω inside HA^ω+AC. The second one is a forcing translation.

I have not worked out the exact translation, because I would prefer to spend the time working out a newer computational-content result, namely that of Berardi-Bezem-Coquand from "On the computational content of the axiom of choice". Still, I would like to approach a question posed by my supervisor:

**Is forcing a monad?** The question is about monads as in programming language theory, not monads from category theory, because we are interested in computational content. Hence we are expected to provide:

1. a type constructor that encloses the computation going on inside the monad

2. a unit operation which creates the initial monad environment

3. a binding operation which composes consecutive computational steps inside the monad
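The three components above can be written down as a record in Coq. This is only a sketch of the standard programming-language notion of a monad; the names `M`, `ret` and `bind` are my own:

```coq
(* Sketch: the three components of a monad as a Coq record.
   No monad laws are stated here, only the operations. *)
Record Monad := {
  M    : Type -> Type;                          (* 1. type constructor *)
  ret  : forall A, A -> M A;                    (* 2. unit operation *)
  bind : forall A B, M A -> (A -> M B) -> M B   (* 3. binding operation *)
}.
```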

I think that these components are already identified and separated in Beeson's exposition in Chapter 15 of the mentioned book.

Namely, he begins with a constructive theory T and creates an extension of it, a theory called Ta, which contains an additional constant 'a' and some axioms concerning this constant. The point of the forcing transformation is then to associate to each formula of Ta a formula of T.

He fixes a "definable" partial order (C,<) with a minimal element 0. Definable means that both C and < are "given by a formula of T, and T proves that (C,<) is a partial order with a minimal element 0".

The type C is also called "a set of conditions".

Now he specifies what forcing is supposed to look like, separately for atomic formulas and for composite ones. In short (but see below for a Coq specification): atomic formulas are expected to satisfy monotonicity and substitution, while composite formulas are expected to satisfy the usual forcing conditions. Beeson then gives a general result for any formula transformation satisfying these conditions, and calls it a soundness theorem: if A is provable in Ta, then "forall p, exists q, q extends p and q forces A" is provable in T.
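The monotonicity condition on atomic formulas could be sketched in Coq as follows. This is my own guess at the statement (Beeson's precise formulation may differ), with atoms, conditions and the extension relation kept abstract:

```coq
(* Sketch (my notation): monotonicity of an atomic forcing
   assignment af -- passing to a stronger condition preserves
   the forcing of atomic formulas. *)
Section Monotonicity.
  Variable atom : Type.            (* atomic formulas, abstractly *)
  Variable C : Type.               (* the set of conditions *)
  Variable ext : C -> C -> Prop.   (* "q extends p" *)

  Definition monotone (af : atom -> C -> Prop) : Prop :=
    forall a p q, ext q p -> af a p -> af a q.
End Monotonicity.
```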

Going back to monads, I think that the given specification determines, type-theoretically, a class of monad types: the composition operation of a particular monad type handles composite formulas, while the unit operation "initializes" the monad with a translation for the atomic formulas.

The two operations have to be given for each formal system separately; Beeson gives them for the concrete formal system he is using, inside his soundness theorem (p. 348).

Here is a type-checked Coq definition of what seems to be a possible type of a forcing monad:

```coq
Program Fixpoint forces (f : formula) (p : C) (af : formula -> C -> Set)
    {measure (depth f)} : Type :=
  match f with
  | atom P t => monotonicity af (atom P t) (* to add substitution *)
  | all A    => forall x : nat, forall q, ext q p ->
                {r : C & ext r q * forces (A^x) r af}
  | imp A B  => forall q, ext q p -> forces A q af ->
                {r : C & ext r q * forces B r af}
  | bot      => Empty_set
  end.
```
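Reading the implication clause of the `forces` definition above, one candidate for the monadic type constructor suggests itself: at a stage p, for any extension q of p, produce a further extension r carrying a value at r. The following is a speculative, self-contained sketch (the definitions and names are my own; `C` and `ext` are assumptions standing in for the conditions and the extension relation above):

```coq
(* Speculative sketch of a "forcing monad" over an abstract
   preorder of conditions. *)
Section ForcingMonad.
  Variable C : Type.
  Variable ext : C -> C -> Prop.
  Hypothesis ext_refl  : forall p, ext p p.
  Hypothesis ext_trans : forall p q r, ext p q -> ext q r -> ext p r.

  (* Type constructor, read off from the implication clause. *)
  Definition F (A : C -> Type) (p : C) : Type :=
    forall q, ext q p -> {r : C & (ext r q * A r)%type}.

  (* unit: a value available at every stage yields a trivial
     forcing computation that does not move to a new stage. *)
  Definition unit (A : C -> Type)
    (a : forall p, A p) (p : C) : F A p :=
    fun q _ => existT _ q (ext_refl q, a q).

  (* bind: run m to reach a stage r with a value of A,
     then continue with f from r, composing the extensions. *)
  Definition bind (A B : C -> Type) (p : C)
    (m : F A p)
    (f : forall r, ext r p -> A r -> F B r) : F B p :=
    fun q Hqp =>
      match m q Hqp with
      | existT _ r (Hrq, a) =>
          match f r (ext_trans r q p Hrq Hqp) a r (ext_refl r) with
          | existT _ s (Hsr, b) =>
              existT _ s (ext_trans s r q Hsr Hrq, b)
          end
      end.
End ForcingMonad.
```

If this shape is right, the composite clauses of `forces` would be instances of `bind`, while the atomic clause would be supplied through `unit` together with the monotonicity assumption.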

Three questions come to my mind at the moment:

1. Is this methodology useful for proving any conservativity result concerning constructive type theory (rather than set theory or Heyting arithmetic)?

2. Is it useful at least as a tool for "applied formal meta-theory"?

3. Does this make any sense?