
Canonicity of linear independence

Introduction

Independence relations play a big role in model theory. An independence relation on a mathematical structure $M$ expresses which subsets are supposed to contain information about each other. This takes the following form: for subsets $A, B, C \subseteq M$ we write $$A \ind_C^M B$$ if "$A$ is independent from $B$ over $C$". We should think of this as "all the information that $B$ has about $A$ is already contained in $C$". This genuinely extends ideas like linear independence in vector spaces, as we will see a bit further down.$\DeclareMathOperator{\span}{span}\DeclareMathOperator{\tp}{tp}\newcommand{\lin}{{\text{lin}}}\newcommand{\N}{\mathbb{N}}\newcommand{\R}{\mathbb{R}}\newcommand{\MM}{\mathfrak{M}}$

One of the cool results is that there can really only be one good independence relation in any (elementary) class of structures! In this post we will explore that theorem in the specific case of vector spaces. That is, we will prove the following statement.

There is only one good independence relation for vector spaces, and that is linear independence.

We will make this into a precise theorem below, see Theorem 2.

Even though this comes from a very deep part of model theory, the proof for this specific case of vector spaces actually does not require any knowledge of model theory! It only requires some basic understanding of linear algebra, namely of linear independence and bases for vector spaces.

In this post we fix some field $k$ and work with $k$-vector spaces. That is, whenever we say "vector space" we really mean $k$-vector space. If we want to be more concrete we can take $k = \R$ and just work with real vector spaces.

Linear independence

We start by formulating linear independence as an independence relation on triples of subsets. For sets $A$ and $B$ we will write $AB$ for $A \cup B$. This is common practice in model theory and it cleans up a lot of notation.

Definition. Let $A, B, C$ be subsets of some vector space $V$. We write $$A \ind_C^{\lin, V} B$$ if the following equivalent conditions hold:

  1. $\span(AC) \cap \span(BC) = \span(C)$;
  2. let $C_0$ be a basis for $\span(C)$ and let $A_0$ and $B_0$ be such that $C_0 A_0$ is a basis for $\span(AC)$ and $C_0 B_0$ is a basis for $\span(BC)$; then $A_0 B_0 C_0$ is a basis for $\span(ABC)$.

We can compare this to the well-known notion of a linearly independent set: a set $A \subseteq V$ of nonzero vectors is linearly independent iff $a \ind_\emptyset^{\lin, V} A - \{a\}$ for all $a \in A$.

The fact that the above conditions are equivalent is just basic linear algebra, and is left as an exercise to the reader.
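For finite sets of rational vectors condition 1 can even be checked mechanically. Since $\span(C)$ is always contained in $\span(AC) \cap \span(BC)$, and $\span(AC) + \span(BC) = \span(ABC)$, the dimension formula $\dim(U \cap W) = \dim U + \dim W - \dim(U + W)$ turns condition 1 into a rank identity. Here is a small Python sketch over $\mathbb{Q}$ (the names `rank` and `lin_independent` are ad hoc, chosen just for this illustration):

```python
from fractions import Fraction

def rank(vectors):
    """Rank of a finite list of vectors over Q, by Gaussian elimination."""
    rows = [[Fraction(x) for x in v] for v in vectors]
    r = 0
    for col in range(len(rows[0]) if rows else 0):
        p = next((i for i in range(r, len(rows)) if rows[i][col] != 0), None)
        if p is None:
            continue  # no pivot in this column
        rows[r], rows[p] = rows[p], rows[r]
        for i in range(r + 1, len(rows)):
            if rows[i][col] != 0:
                f = rows[i][col] / rows[r][col]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

def lin_independent(A, B, C):
    """Condition 1: span(AC) ∩ span(BC) = span(C).  Since span(C) always
    sits inside the intersection and span(AC) + span(BC) = span(ABC),
    the dimension formula reduces this to a rank identity."""
    return rank(A + C) + rank(B + C) - rank(A + B + C) == rank(C)
```

For example, `lin_independent([(1, 0, 0)], [(0, 0, 1)], [(0, 1, 0)])` holds, while `lin_independent([(1, 1, 0)], [(1, 0, 0)], [(0, 1, 0)])` fails, because $(1, 0, 0)$ lies in both spans but not in $\span(C)$.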

The following is immediate from the definition of $\ind^\lin$.

Observation. Let $f: V \to W$ be an injective linear map. Then for any $A, B, C \subseteq V$ we have $A \ind_C^{\lin, V} B$ iff $f(A) \ind_{f(C)}^{\lin, W} f(B)$.

We call this property invariance, and we will define an abstract independence relation as an invariant ternary relation on subsets.

Definition. An independence relation is a ternary relation $\ind$ on subsets of vector spaces that is invariant. That is, if subsets $A, B, C \subseteq V$ are in the relation we write $A \ind_C^V B$. The invariance property means that for any injective linear map $f: V \to W$ and any $A, B, C \subseteq V$ we have $A \ind_C^V B$ iff $f(A) \ind_{f(C)}^W f(B)$.

The monster and automorphisms

Due to the invariance property we might as well just consider a very big vector space. For example, we can consider independence in $\R^n$ for each $n$ separately. However, they all embed into $\bigoplus_{n \in \N} \R$, and since independence is invariant we might as well work in that bigger space.

This brings us to the model-theoretic idea of a monster model. That is a model, in our case a vector space, that is so big that anything we want to consider is already in there. Depending on how comfortable we are with working with different cardinals, we can pick one of the following approaches to make this idea of a monster model precise.

  1. We only consider subsets that are at most countable and we work in an uncountably infinite dimensional vector space.
  2. At the start of any proof we fix some cardinal $\kappa$ that is bigger than the cardinality of any of the subsets that appear in the proof and we consider a $\kappa$-dimensional vector space (of course, this requires the subsets we construct to not depend on the size of our vector space, but that will always be the case).

From now on we work in a monster model $\MM$, which is made precise by picking one of the two options above. One of the nice things about working in just one very big monster where everything we consider is supposed to live, is that we do not have to refer to it every time. So we will drop it from the notation. For example, rather than saying "let $A, B, C \subseteq \MM$ be subsets" we will say "let $A, B, C$ be sets" and it is then understood that we mean subsets of the monster $\MM$. Similarly, we will drop it from the notation for independence relations, so we will write $A \ind_C B$ rather than $A \ind_C^\MM B$.

Definition. Let $\bar{a}$ and $\bar{b}$ be tuples of the same length and let $C$ be any set. We write $\bar{a} \equiv_C \bar{b}$ if there is an automorphism $f: \MM \to \MM$ of our monster model such that $f(\bar{a}) = \bar{b}$ and $f$ fixes $C$ pointwise (so $f(c) = c$ for all $c \in C$).

Model theorists will recognise the above notion as having the same type. Interestingly we will have no use for the classical definition of a type in terms of logical formulas.

The following lemma will be useful for constructing automorphisms.

Lemma (automorphism lemma). Let $A$ and $A'$ be linearly independent sets and let $g: A \to A'$ be a bijection. Let $B$ be any set such that $\span(A) \cap \span(B) = \{ 0 \} = \span(A') \cap \span(B)$. Then there is an automorphism $f: \MM \to \MM$ that extends $g$ and fixes $\span(B)$ pointwise.

Proof. Let $B_0$ be a basis for $\span(B)$. Then $A B_0$ and $A' B_0$ are both linearly independent sets. We can extend both of these sets to a basis for $\MM$. That is, there are $C$ and $C'$ such that $A B_0 C$ and $A' B_0 C'$ are bases for $\MM$. Because the dimension of $\MM$ is much bigger than any of the sets involved (so it is bigger than $|A A' B_0|$) we have that $|C| = |C'|$. So there is a bijection $h: A B_0 C \to A' B_0 C'$ that extends $g$ and is the identity on $B_0$. Since $h$ is a bijection of bases it extends to an automorphism $f: \MM \to \MM$, and since $h$ is the identity on the basis $B_0$ of $\span(B)$, $f$ fixes $\span(B)$ pointwise. $\square$
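The mechanism at the end of the proof — a bijection between bases extends uniquely to a linear automorphism — can be made concrete in finite dimension, with $\mathbb{Q}^n$ standing in for the monster. A sketch (the names `coords` and `extend_to_map` are ad hoc): the extension is the unique linear map determined by where the basis is sent.

```python
from fractions import Fraction

def coords(basis, v):
    """Coordinates of v in the given basis of Q^n, found by Gauss-Jordan
    elimination on the augmented system [basis columns | v]."""
    n, m = len(v), len(basis)
    A = [[Fraction(basis[j][i]) for j in range(m)] + [Fraction(v[i])]
         for i in range(n)]
    row, piv_cols = 0, []
    for col in range(m):
        p = next((i for i in range(row, n) if A[i][col] != 0), None)
        if p is None:
            continue
        A[row], A[p] = A[p], A[row]
        A[row] = [x / A[row][col] for x in A[row]]  # make the pivot 1
        for i in range(n):
            if i != row and A[i][col] != 0:
                A[i] = [a - A[i][col] * b for a, b in zip(A[i], A[row])]
        piv_cols.append(col)
        row += 1
    cs = [Fraction(0)] * m
    for r, col in enumerate(piv_cols):
        cs[col] = A[r][m]
    return cs

def extend_to_map(old_basis, new_basis):
    """The unique linear map sending old_basis[i] to new_basis[i]:
    a finite-dimensional stand-in for extending a bijection of bases
    to an automorphism of the monster."""
    def f(v):
        cs = coords(old_basis, v)
        return tuple(sum(c * b[i] for c, b in zip(cs, new_basis))
                     for i in range(len(v)))
    return f
```

For instance, `extend_to_map([(1, 0), (1, 1)], [(0, 1), (1, 0)])` is the automorphism of $\mathbb{Q}^2$ sending $(1,0) \mapsto (0,1)$ and $(1,1) \mapsto (1,0)$, and every other vector is sent wherever linearity forces it to go.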

Good independence

In what follows we will sometimes also write tuples where we would expect a subset. In such a case we just consider the tuple as a subset. For example, if $\bar{a} = (a_1, \ldots, a_n)$ then we might write $\bar{a} \ind_C B$ instead of $\{a_1, \ldots, a_n\} \ind_C B$.

We can now give a definition of what we will consider to be a good independence relation. So this will turn out to be the list of properties that characterises linear independence in vector spaces!

Definition. We call an independence relation $\ind$ good if it satisfies the following properties. In these statements $A, B, B', C, D$ are arbitrary sets and $\bar{a}$ is an arbitrary tuple.

  • symmetry $A \ind_C B$ implies $B \ind_C A$,
  • monotonicity if $A \ind_C B$ and $B' \subseteq B$ then $A \ind_C B'$,
  • existence $A \ind_C C$,
  • finite character if $A \ind_C B_0$ for all finite $B_0 \subseteq B$ then $A \ind_C B$,
  • local character if $A$ is finite then there is finite $B_0 \subseteq B$ such that $A \ind_{B_0} B$,
  • base-monotonicity if $A \ind_C B$ then for any $C'$ with $C \subseteq C' \subseteq B$ we have $A \ind_{C'} B$,
  • transitivity $A \ind_C B$ and $A \ind_B D$ and $C \subseteq B$ implies $A \ind_C D$,
  • extension if $\bar{a} \ind_C B$ then for any $D$ there is $\bar{a}'$ with $\bar{a} \equiv_{BC} \bar{a}'$ such that $\bar{a}' \ind_C BD$,
  • stationarity if $\bar{a} \ind_C B$ and $\bar{a}'$ is such that $\bar{a}' \ind_C B$ and $\bar{a} \equiv_C \bar{a}'$ then also $\bar{a} \equiv_{BC} \bar{a}'$.

Of course, these properties are not chosen at random. Our linear independence relation $\ind^\lin$ will satisfy all of these properties, as we will see below. We will also see that verifying most of these properties is really easy.

Proposition 1. The independence relation $\ind^\lin$ is a good independence relation.

Proof. The properties symmetry, monotonicity, existence, finite character, local character and base-monotonicity are direct from the definition (the first condition is easiest here, except for base-monotonicity, where the second condition is easiest). We verify the remaining properties.

  • transitivity We need to prove that $A \ind^\lin_C B$ and $A \ind^\lin_B D$ and $C \subseteq B$ implies $A \ind^\lin_C D$. Since $C \subseteq B$ we have $\span(BC) = \span(B)$. Combining this with $A \ind_C^\lin B$ we get $\span(C) = \span(AC) \cap \span(BC) = \span(AC) \cap \span(B)$. We also have $\span(AB) \cap \span(BD) = \span(B)$ from $A \ind_B^\lin D$. So substituting this in the previous equation we get $\span(C) = \span(AC) \cap \span(AB) \cap \span(BD) = \span(AC) \cap \span(BD)$. In the last equation we used $C \subseteq B$ again, and we use it once more to see that $\span(CD) \subseteq \span(BD)$. Hence $\span(AC) \cap \span(CD) \subseteq \span(AC) \cap \span(BD) = \span(C)$. As we always have $\span(C) \subseteq \span(AC) \cap \span(CD)$ we conclude that $\span(AC) \cap \span(CD) = \span(C)$ and hence $A \ind^\lin_C D$ as required.
  • extension We need to prove that if $\bar{a} \ind^\lin_C B$ then for any $D$ there is $\bar{a}'$ with $\bar{a} \equiv_{BC} \bar{a}'$ such that $\bar{a}' \ind^\lin_C BD$. Let $C_0$ be a basis for $\span(C)$ and let $A_0$ be such that $C_0 A_0$ is a basis for $\span(C\bar{a})$. Let $A_0'$ be a linearly independent set with $|A_0'| = |A_0|$ such that $\span(A_0') \cap \span(BCD) = \{0\}$. This can be done because we work in a big enough monster. Because of $\bar{a} \ind^\lin_C B$ and our choice of $A_0'$ we have $\span(A_0) \cap \span(BC) = \{0\} = \span(A_0') \cap \span(BC)$. So by the automorphism lemma there is an automorphism $f: \MM \to \MM$ sending $A_0$ to $A_0'$ while fixing $BC$ pointwise. We set $\bar{a}' = f(\bar{a})$. So we indeed have $\bar{a} \equiv_{BC} \bar{a}'$. As $\span(C\bar{a}') = \span(A_0' C)$ we have $\bar{a}' \ind^\lin_C BD$ by our choice of $A_0'$.
  • stationarity We need to prove that if $\bar{a} \ind^\lin_C B$ and $\bar{a}'$ is such that $\bar{a}' \ind^\lin_C B$ and $\bar{a} \equiv_C \bar{a}'$ then also $\bar{a} \equiv_{BC} \bar{a}'$. Let $C_0$ be a basis for $\span(C)$ and let $A_0$ be such that $C_0 A_0$ is a basis for $\span(C\bar{a})$. Then because $\bar{a} \ind^\lin_C B$ we have $\span(A_0) \cap \span(BC) = \{0\}$. Because $\bar{a} \equiv_C \bar{a}'$ there is an automorphism $f: \MM \to \MM$ such that $f(\bar{a}) = \bar{a}'$ while fixing $C$. We set $A_0' = f(A_0)$, so $A_0' C_0$ is a basis for $\span(C\bar{a}')$. Using $\bar{a}' \ind^\lin_C B$ we then get $\span(A_0') \cap \span(BC) = \{0\}$. Let $g: A_0 \to A_0'$ be the restriction of $f$ to $A_0$, so $g$ is a bijection. We can thus apply the automorphism lemma to find an automorphism $h: \MM \to \MM$ that extends $g$ and is the identity on $BC$. Since $h$ agrees with $f$ on $A_0$ and on $C$, it agrees with $f$ on all of $\span(C A_0) \ni \bar{a}$. In particular $h(\bar{a}) = f(\bar{a}) = \bar{a}'$ and so we conclude $\bar{a} \equiv_{BC} \bar{a}'$, as required.

This completes the proof. $\square$
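The properties that are "direct from the definition" can also be spot-checked by machine on random examples. Below is a seeded randomized check for $\ind^\lin$ on subsets of $\mathbb{Q}^3$ (a self-contained sketch with ad hoc names; a passing run is evidence in line with Proposition 1, not a proof):

```python
import random
from fractions import Fraction

def rank(vectors):
    """Rank over Q by Gaussian elimination."""
    rows = [[Fraction(x) for x in v] for v in vectors]
    r = 0
    for col in range(len(rows[0]) if rows else 0):
        p = next((i for i in range(r, len(rows)) if rows[i][col] != 0), None)
        if p is None:
            continue
        rows[r], rows[p] = rows[p], rows[r]
        for i in range(r + 1, len(rows)):
            if rows[i][col] != 0:
                f = rows[i][col] / rows[r][col]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

def lin_independent(A, B, C):
    """Condition 1 of the definition, via the dimension formula."""
    return rank(A + C) + rank(B + C) - rank(A + B + C) == rank(C)

def spot_check(trials=300, seed=1):
    """Randomly test existence, symmetry, monotonicity and
    base-monotonicity of lin_independent on subsets of Q^3."""
    rng = random.Random(seed)
    pool = [(1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 0), (0, 1, 1), (1, 1, 1)]

    def sub(xs):
        return [x for x in xs if rng.random() < 0.5]

    for _ in range(trials):
        A, B = sub(pool), sub(pool)
        C = sub(B)                                # ensures C <= C' <= B below
        assert lin_independent(A, C, C)           # existence
        if lin_independent(A, B, C):
            assert lin_independent(B, A, C)       # symmetry
            assert lin_independent(A, sub(B), C)  # monotonicity
            Cp = C + sub([b for b in B if b not in C])
            assert lin_independent(A, B, Cp)      # base-monotonicity
    return True
```

Calling `spot_check()` runs through all the trials without tripping an assertion.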

Main result

We can now move on to our main result.

Theorem 2 (main result). Any good independence relation must be $\ind^\lin$.

Proof. Let $\ind$ be a good independence relation. By finite character, monotonicity and symmetry we can reduce the problem to proving that for any finite $A$ and any $B$ and $C$ we have $A \ind_C B$ if and only if $A \ind_C^\lin B$. To do this we introduce a technical independence relation $\ind^i$ below. We will then prove that for any good independence relation $\ind$ we have that $A \ind_C B$ if and only if $A \ind_C^i B$ for any finite $A$ and any $B$ and $C$ (Lemmas 3 and 6). Then, since $\ind^\lin$ is a good independence relation (Proposition 1) we get that $A \ind_C^\lin B$ iff $A \ind_C^i B$ iff $A \ind_C B$, as required. $\square$

In the definition below we will write something like $\bar{d} \in D$, which is maybe technically not entirely correct. It just means that $\bar{d}$ is a tuple with its elements in $D$. So if $\bar{d}$ were finite, say of length $n$, then $\bar{d} \in D^n$ would be more precise. However, we do not really care about the exact length of our tuples, so we just drop it from the notation.

Definition. We define an independence relation $\ind^i$ as follows. We let $\bar{a} \ind_C^i \bar{b}$ if for any $D$ there is some $\bar{a}'$ with $\bar{a}' \equiv_{C\bar{b}} \bar{a}$ such that for any $\bar{d}, \bar{d}' \in D$ with $\bar{d} \equiv_C \bar{d}'$ we have $\bar{a}'\bar{d} \equiv_C \bar{a}'\bar{d}'$.

The model theorists may recognise this as extending to an invariant type. That is, the above definition is saying exactly that $\tp(\bar{a}/C\bar{b})$ extends to a $C$-invariant type over $D$.

Lemma 3. Let $\ind$ be a good independence relation, then $\bar{a} \ind_C \bar{b}$ implies $\bar{a} \ind_C^i \bar{b}$.

Proof. Let $D$ be an arbitrary set. Apply extension to find $\bar{a}'$ with $\bar{a}' \equiv_{C\bar{b}} \bar{a}$ such that $\bar{a}' \ind_C D\bar{b}$. We claim that $\bar{a}'$ is as described in the definition of $\ind^i$. To verify this we let $\bar{d}, \bar{d}' \in D$ with $\bar{d} \equiv_C \bar{d}'$. By monotonicity we have $\bar{a}' \ind_C \bar{d}$ and $\bar{a}' \ind_C \bar{d}'$, so an application of stationarity directly gives us $\bar{a}'\bar{d} \equiv_C \bar{a}'\bar{d}'$, as required. $\square$

The other direction is a bit more complicated and involves the idea of a sequence where every next tuple in the sequence is independent from all the previous tuples.

Definition. Given a good independence relation $\ind$ and any set $C$ we say that a sequence $(\bar{b}_n)_{n \in \N}$ is a $\ind_C$-independent sequence if $\bar{b}_n \ind_C C\bar{b}_{< n}$ for all $n \in \mathbb{N}$. Here $\bar{b}_{< n}$ is an abbreviation for $(\bar{b}_k)_{k < n}$.

If we apply the above definition to $\ind^\lin$ we recover the notion of a linearly independent set. More precisely, a sequence $(b_n)_{n \in \N}$ of distinct nonzero vectors is a $\ind_\emptyset^\lin$-independent sequence iff $\{b_n : n \in \N\}$ is linearly independent. So at least in this very specific case independent sequences exist, but it turns out that we can always construct them for any good independence relation.
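For finite sequences of rational vectors this equivalence can be tested directly: check $b_n \ind^\lin_\emptyset \{b_0, \ldots, b_{n-1}\}$ for each $n$. A small self-contained Python sketch over $\mathbb{Q}$ (the names are ad hoc):

```python
from fractions import Fraction

def rank(vectors):
    """Rank over Q by Gaussian elimination."""
    rows = [[Fraction(x) for x in v] for v in vectors]
    r = 0
    for col in range(len(rows[0]) if rows else 0):
        p = next((i for i in range(r, len(rows)) if rows[i][col] != 0), None)
        if p is None:
            continue
        rows[r], rows[p] = rows[p], rows[r]
        for i in range(r + 1, len(rows)):
            if rows[i][col] != 0:
                f = rows[i][col] / rows[r][col]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

def lin_independent(A, B, C):
    """Condition 1 of the definition, via the dimension formula."""
    return rank(A + C) + rank(B + C) - rank(A + B + C) == rank(C)

def is_independent_sequence(seq):
    """A finite sequence (b_0, ..., b_m) is independent over the empty
    set when each b_n is independent from b_0, ..., b_{n-1}."""
    return all(lin_independent([seq[n]], list(seq[:n]), [])
               for n in range(len(seq)))
```

For example, `is_independent_sequence([(1, 0, 0), (0, 1, 0), (0, 0, 1)])` holds, while replacing the last vector by $(1, 1, 0)$ breaks it, since $(1, 1, 0)$ already lies in the span of the first two vectors.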

Lemma 4. Let $\ind$ be a good independence relation, let $\bar{b}$ be any tuple and let $C$ be any set. Then there is a $\ind_C$-independent sequence $(\bar{b}_n)_{n \in \N}$ with $\bar{b}_0 = \bar{b}$ and $\bar{b}_n \equiv_C \bar{b}$ for all $n \in \N$.

Proof. We construct the sequence inductively. We start with $\bar{b}_0 = \bar{b}$ of course. Then $\bar{b}_0 \ind_C C$ by the existence property, so this covers the base case. Now for the inductive step, suppose that $(\bar{b}_k)_{k \leq n}$ has been constructed. Again using existence we have $\bar{b} \ind_C C$, to which we can apply extension to find $\bar{b}'$ with $\bar{b}' \equiv_C \bar{b}$ and $\bar{b}' \ind_C C(\bar{b}_k)_{k \leq n}$. We set $\bar{b}_{n+1} = \bar{b}'$, which completes the induction step and with that the proof is complete. $\square$

Lemma 5. Let $\ind$ be a good independence relation and let $(\bar{b}_n)_{n \in \N}$ be a $\ind_C$-independent sequence. For any finite $\bar{a}$ there is $n \in \N$ such that $\bar{a} \ind_C \bar{b}_n$.

Proof. By local character there are finite $C_0 \subseteq C$ and $B_0 \subseteq (\bar{b}_n)_{n \in \N}$ such that $\bar{a} \ind_{B_0 C_0} C(\bar{b}_n)_{n \in \N}$. Since $B_0$ is finite there is some $n \in \N$ such that $B_0 \subseteq \bar{b}_{<n}$. By base-monotonicity we then have $\bar{a} \ind_{C \bar{b}_{<n}} C(\bar{b}_n)_{n \in \N}$. By monotonicity we then get $\bar{a} \ind_{C \bar{b}_{<n}} \bar{b}_n$ and then by symmetry we find $$\bar{b}_n \ind_{C \bar{b}_{<n}} \bar{a}.$$ Using the assumption that $(\bar{b}_n)_{n \in \N}$ is a $\ind_C$-independent sequence we also get $$\bar{b}_n \ind_C C \bar{b}_{<n}.$$ So we can apply transitivity to obtain $\bar{b}_n \ind_C \bar{a}$ and the result follows after applying symmetry once more. $\square$

The following lemma is the last missing piece in the proof of our main result.

Lemma 6. Let $\ind$ be a good independence relation, then for any finite $\bar{a}$ we have that $\bar{a} \ind_C^i \bar{b}$ implies $\bar{a} \ind_C \bar{b}$.

Proof. By Lemma 4 there is a $\ind_C$-independent sequence $(\bar{b}_n)_{n \in \N}$ with $\bar{b}_0 = \bar{b}$ and $\bar{b}_n \equiv_C \bar{b}$ for all $n \in \N$. We apply the definition of $\ind^i$ with $(\bar{b}_n)_{n \in \N}$ in the role of $D$. We thus get some $\bar{a}'$ with $\bar{a}' \equiv_{C\bar{b}} \bar{a}$. By Lemma 5 there must be some $n \in \N$ such that $\bar{a}' \ind_C \bar{b}_n$. As $\bar{b}_n \equiv_C \bar{b}$ we have by our choice of $\bar{a}'$ that $\bar{a}' \bar{b}_n \equiv_C \bar{a}' \bar{b} \equiv_C \bar{a} \bar{b}$, where the last equivalence follows from $\bar{a}' \equiv_{C\bar{b}} \bar{a}$. So by invariance we get $\bar{a} \ind_C \bar{b}$, as required. $\square$
