Monday, October 24, 2016

10/22/2016 UIC Workshop Day 3

Starting off this day, Justin Moore finished up his tutorial. Today was more about generating certain kinds of trees in such a way that CH is not violated. This, of course, continues the theme of the previous day: the existence of certain trees is known to follow from MA and to be denied by diamond, so if one of these trees can be forced to exist while maintaining CH, it provides evidence that diamond, and not CH alone, was doing the work. In particular, Moore covered some of the details of a forcing which is completely proper, adds a special tree, and remains completely proper in the extension. It's interesting to me that when I read through the material on trees in Kunen I mistakenly thought of it as a kind of curiosity and as mostly completed work. I was way wrong, but I'm happy that I was, as the discussion of these tree properties, square properties, and other combinatorial properties generates a lot of interesting mathematics.
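To keep the terminology straight for myself (this is just my gloss on the standard definitions from Kunen, not anything specific to Moore's construction): a tree of height $\omega_1$ is Aronszajn if all of its levels are countable and it has no uncountable branch, and such a tree $T$ is special if it decomposes into countably many antichains,
\[ T = \bigcup_{n < \omega} A_n, \qquad \text{each } A_n \text{ an antichain of } T. \]
A Souslin tree, having no uncountable antichains, can never be special, which is where the MA versus diamond tension comes from: MA$_{\aleph_1}$ implies every Aronszajn tree is special, while diamond constructs Souslin trees.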

Anush Tserunyan was the next speaker, and her talk was on "Integer Cost and Ergodic Actions." She spent a little while acclimating the group to the ideas of invariant descriptive set theory, and although that was review for me, her perspective and intuitions were helpful. In particular, she focused on the interplay between group actions, equivalence relations, and graphs. These connections provide background for the questions "What is the most efficient presentation of an equivalence relation E?" and "What is the smallest number of edges required to create a graph G which generates E?" Towards a qualitative answer, you can ask whether there is an acyclic Borel graph which captures the equivalence relation. A good example here is the equivalence relation generated by the free action of a free group. More quantitatively, using a measure you can integrate the number of edges at each point, take an infimum over such graphs, and obtain the cost. Gaboriau showed in 1998 that this infimum is achieved precisely when the equivalence relation can be captured by an acyclic Borel graph. Hjorth continued in this vein, looking more specifically at countable ergodic measure-preserving equivalence relations: if the cost is an integer or infinite, then the equivalence relation can be seen as generated by a free action of a free group with at most countably many generators. Anush created tools which not only simplify the proof of this substantially, but allow her to strengthen its conclusion: the subactions generated by the individual generators are ergodic as well. Anush is always a good speaker, and since I normally see her speak at descriptive set theory sessions, it was interesting to see how she catered her material to a more pure set theory audience. Her ability to live in both the world of set theory and that of analysis is something I strive for. As a final note, something in her discussion of these graph representations shook loose some ideas I've been having about the qualitative characterizations of various quotients of the reals. Even though E_0 cannot be linearly ordered, it can be represented by an acyclic graph, and in that sense its partial order structure is less complicated than, say, its product with the equivalence relation generated by the free group on two generators acting freely. So it seems that to separate E_0 and l_1 qualitatively, I should be looking at ways in which they can be represented.
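For my own notes, here is roughly how the quantitative side goes (my paraphrase of the usual definition of cost, not necessarily her exact formulation): for a countable Borel equivalence relation $E$ on a standard probability space $(X,\mu)$, with $\mu$ $E$-invariant, a graphing of $E$ is a Borel graph $G$ on $X$ whose connected components are exactly the $E$-classes, and the cost of $E$ is
\[ C_\mu(E) \;=\; \inf_{G \text{ a graphing of } E} \; \frac{1}{2}\int_X \deg_G(x)\, d\mu(x), \]
i.e., the cheapest average number of edges per point needed to generate $E$. In this language, Gaboriau's result mentioned above says the infimum is attained precisely when $E$ admits a treeing, that is, an acyclic Borel graphing.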

In the afternoon, Matt Foreman did the first two parts of his tutorial, "Applications of Descriptive Set Theory to Classical Dynamical Systems." This was based on the same slides as when I saw Foreman speak at CUNY, although this time he catered the talk to an audience of peers as opposed to graduate students. As such, he included a number of insights that were not in the CUNY talk. Also, he had more time here, so he was able to go into more detail. As before, the problem is to see if it is possible to classify the ergodic measure-preserving diffeomorphisms of compact manifolds. This problem was originally posed by von Neumann in the 1930s. You can argue that attempts to solve it are in the background of many of the significant developments in dynamics, such as the notion of entropy, although it's not clear that the field is still motivated by it today. Regardless, finding a solution to an 80-year-old problem is impressive. The answer is no: there is no simple way to classify the ergodic measure-preserving diffeomorphisms of compact manifolds. More precisely, this problem is more complicated than the most complicated problems which admit algebraic invariants, namely those classification problems which can be seen as arising from actions of the infinite permutation group. If you drop the diffeomorphism requirement and just look at ergodic measure-preserving transformations, the problem is complete analytic, which is as bad as it could be. This uses the fact that odometer-based dynamical systems are in some sense generic in the space of dynamical systems. Foreman and Weiss, however, really wanted to solve von Neumann's problem. To do this, they created a new type of dynamical system, motivated by odometer systems, called circular systems. While it is still unknown whether an arbitrary odometer system can be realized as an ergodic measure-preserving diffeomorphism, they were able to show that circular systems can be. There is also a nice isomorphism between the class of odometer-based systems and the class of circular-based systems. Putting this all together, you get that the diffeomorphism problem is also complete analytic.
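For reference, the yardstick behind "as bad as it could be" (my reminder of the standard notion, not Foreman's phrasing): a set $A$ in a Polish space $X$ is complete analytic if $A$ is analytic and every analytic set $B$ in a Polish space $Y$ Borel-reduces to it, meaning there is a Borel map $f \colon Y \to X$ with
\[ y \in B \iff f(y) \in A. \]
Since some analytic sets are not Borel, a complete analytic set is never Borel, so an isomorphism relation which is complete analytic is not even a Borel set of pairs, let alone one classifiable by reasonable invariants.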

Finishing off the third day, Maxwell Levine spoke on "Weak Squares and Very Good Scales." I saw a twenty-minute version of this talk at the CUNY graduate student conference. When I saw it there I knew absolutely no pcf theory, and I had a hard time tracking what was happening. Now that I've done the pcf summer school and Maxwell had an hour, I think I got a much better idea of what he is doing. There are three kinds of combinatorial principles in play in his work: square principles, stationary reflection, and scales. Stationary reflection is a kind of compactness property, and can be provided by large cardinals. Square principles are a kind of incompactness; they hold in models like L, and, for example, can provide for the existence of an almost metrizable space which is not itself metrizable. A theorem of Shelah says that scales exist, but the existence of a very good scale contradicts stationary reflection. What's somewhat odd is that square properties can coexist with stationary reflection; even a fairly strong square property can. The weakest square property in particular can even coexist with simultaneous stationary reflection, but it is not immediately clear whether the second weakest square can. One way to check this is to see whether the second weakest square implies the existence of a very good scale. Maxwell was able to construct a model where the weakest square holds and there are no very good scales; however, the second weakest square still fails in this model. So he next tried to see if it is possible for the second weakest square to hold strictly. The answer is yes, and it can even happen at relatively small cardinals (aleph_omega). Complicating the picture even more, in the presence of large cardinals this second weakest square property implies that the strongest form of simultaneous stationary reflection fails. In fact, Maxwell is able to explicitly characterize why this reflection fails. What I like about this work now is that Maxwell is basically taking the theory of cardinal arithmetic apart piece by piece and seeing which steps are really necessary. At the very least, I feel like I have a better understanding of the machinery of cardinal arithmetic after seeing him speak.
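Since the scale terminology was exactly the part I'd lost track of at CUNY, here is the rough shape of the definitions as I understand them (my paraphrase, with the usual caveats about the precise formulation): a scale on aleph_omega is a sequence $\langle f_\alpha : \alpha < \aleph_{\omega+1} \rangle$ of functions in a product $\prod_{n \in A} \aleph_n$, for some infinite $A \subseteq \omega$, which is increasing and cofinal modulo finite sets. A point $\alpha$ of uncountable cofinality is very good if there are a club $C \subseteq \alpha$ and an $n$ such that
\[ f_\beta(m) < f_\gamma(m) \quad \text{whenever } \beta < \gamma \text{ are in } C \text{ and } m \geq n, \]
and the scale is very good when every such point (or at least club-many of them) is very good. Shelah's theorem gives scales outright; very goodness is the extra uniformity that clashes with reflection.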
