CUNY Probability Seminar, Fall 2023

The CUNY Probability Seminar will have both in-person and online talks. Its usual time is Tuesdays from 4:15 to 5:15 pm Eastern Time. Exact dates, times, and locations are listed below. If you are interested in speaking at the seminar, or would like to be added to or removed from the seminar mailing list, please contact either of the seminar coordinators, Matthew Junge and Emma Bailey.

Seminar Schedule:

The seminar meets on Tuesdays from 4:15 to 5:15 pm Eastern Time, beginning September 12th.

The Zoom link, when applicable, will be sent out via the CUNY Probability Seminar listserv. If you are not on the mailing list, please contact the seminar organizers to receive the joining information.

Time: September 12, 4:15 – 5:15 pm EDT
Speaker: Matt Junge
Location: The Graduate Center, Room 6417
Title: The frog model on trees

Abstract: The frog model features random activation and spread. Think combustion or an epidemic. I have studied these dynamics on d-ary trees for ten years. I will discuss our progress and what remains to be done.
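For readers unfamiliar with the model, here is a minimal simulation of one standard variant (one sleeping frog per vertex, active frogs performing simple random walks). The finite depth, step count, and other parameters are illustrative choices, not taken from the talk:

```python
import random

def frog_model(d=2, depth=6, steps=200, seed=0):
    """Frog model on a complete d-ary tree of the given depth.

    One sleeping frog per vertex; only the root's frog starts awake.
    Active frogs perform simple random walks on the tree, and a frog
    landing on a vertex with a sleeping frog wakes it.  Returns the
    number of visits to the root and the number of awake frogs.
    """
    rng = random.Random(seed)
    n = (d ** (depth + 1) - 1) // (d - 1)     # number of vertices (d >= 2)
    awake = [False] * n
    awake[0] = True
    frogs = [0]                               # positions of active frogs
    root_visits = 0
    for _ in range(steps):
        woken = []
        for i, v in enumerate(frogs):
            nbrs = [(v - 1) // d] if v > 0 else []                 # parent
            nbrs += [c for c in range(d * v + 1, d * v + d + 1) if c < n]
            v = rng.choice(nbrs)              # one random-walk step
            frogs[i] = v
            if v == 0:
                root_visits += 1
            if not awake[v]:
                awake[v] = True
                woken.append(v)               # new frog activates here
        frogs.extend(woken)
    return root_visits, sum(awake)
```

On a finite tree every frog is eventually woken; the questions in the talk concern the infinite d-ary tree (e.g. whether the root is visited infinitely often), which a finite sketch like this can only hint at.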

Time: September 19, 4:15 – 5:15 pm EDT
Speaker: Arthur Jacot
Location: The Graduate Center, Room 6417
Title: Implicit Bias of Large Depth Networks: the Bottleneck Rank

Abstract: To understand Deep Neural Networks (DNNs), we need to understand their implicit bias, i.e. the types of functions that they can learn efficiently. I will argue that large-depth DNNs are biased towards learning functions that can be decomposed as a first function mapping from the input space to a low-dimensional ‘latent space’ and a second function mapping from this latent space to the outputs. This bias is the result of a Bottleneck structure in the learned features of the network, where the representations of ‘almost all’ layers of the network are approximately k-dimensional for some integer k. Learning such low-dimensional representations can also be interpreted as the network learning symmetries of the task, which could explain the success of DNNs on image and language tasks, which have many symmetries. But these feature-compression abilities come with dangers too: the network can underestimate the inner dimension k, or equivalently learn spurious symmetries that are not actual symmetries of the task. Theoretical evidence suggests, however, that these issues are rare in practice.


Time: September 26, 4:15 – 5:15 pm EDT
Speaker: Alper Gunes
Location: The Graduate Center, Room 6417
Title: Moments of characteristic polynomials of random matrices, Painlevé equations, and L-functions

Abstract: In this talk, we will consider various types of joint moments of characteristic polynomials of random matrices that are sampled according to the Haar measure on classical compact groups. In each case, we will see how one can obtain the asymptotics of these quantities as the matrix size tends to infinity, and see the implications that these asymptotics have for moments of L-functions. Finally, we will see that these asymptotics have representations in terms of solutions of certain Painlevé equations, giving us conjectures relating L-functions and solutions of these Painlevé systems. Based on joint work with Assiotis, Bedert and Soor, and some original results obtained as part of my undergraduate thesis.
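As a concrete instance of such moments: the classical Keating–Snaith formula expresses the 2k-th moment of |det(I − U)| for Haar-random U in U(N) as a product of Gamma factors, and for k = 1 the product telescopes to N + 1. A quick stdlib check of that reduction (the function name is my own):

```python
from math import gamma

def cue_moment(N, k):
    """Keating-Snaith formula for the unitary group:
    E_{U(N)} |det(I - U)|^{2k}  =  prod_{j=1}^{N} G(j) G(j+2k) / G(j+k)^2,
    where G is the Gamma function and U is Haar-distributed."""
    prod = 1.0
    for j in range(1, N + 1):
        prod *= gamma(j) * gamma(j + 2 * k) / gamma(j + k) ** 2
    return prod

# For k = 1 each factor is (j + 1)/j, so the product telescopes to N + 1.
```

The asymptotics of such moments as N → ∞ (the leading constant grows like a power of N times a Barnes G-function factor) are what feed the moment conjectures for L-functions mentioned in the abstract.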

Time: October 3, 4:15 – 5:15 pm EDT
Speaker: Sayan Banerjee
Location: The Graduate Center, Room 6417
Title: Exploration-driven networks

Abstract: We propose and investigate a class of random networks where incoming vertices locally traverse the network in the direction of the root for a random number of steps before attaching to the terminal vertex. Specific instances of these networks correspond to uniform attachment, linear preferential attachment and attachment with probability proportional to vertex PageRanks. We obtain local weak limits for such networks and use them to derive asymptotics for the limiting empirical degree and PageRank distributions. We also quantify asymptotics for the degree and PageRank of fixed vertices, including the root, and the height of the network. Two distinct regimes are seen to emerge, based on the expected exploration distance of incoming vertices, which we call the ‘fringe’ and ‘non-fringe’ regimes. These regimes are shown to exhibit different qualitative and quantitative properties. In particular, networks in the non-fringe regime undergo ‘condensation’ where the root degree grows at the same rate as the network size. Networks in the fringe regime do not exhibit condensation. A non-trivial phase transition phenomenon is also displayed for the PageRank distribution, which connects to the well-known power-law hypothesis. Joint work with Shankar Bhamidi and Xiangying (Zoe) Huang.
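To make the attachment mechanism concrete, here is a toy instance: each incoming vertex starts at an existing vertex and walks toward the root for a random number of steps before attaching where it stops. The uniform choice of starting vertex and the geometric exploration length below are my illustrative choices, not necessarily the setup analyzed in the paper:

```python
import random

def exploration_tree(n=2000, p=0.5, seed=0):
    """Grow a tree on n vertices.  Each incoming vertex starts at a
    uniformly chosen existing vertex, takes a Geometric(p) number of
    steps toward the root (stopping early if it reaches the root), and
    attaches to the vertex where the walk ends.

    Returns the parent array and the degree of the root.
    """
    rng = random.Random(seed)
    parent = [None]                      # vertex 0 is the root
    deg = [0]
    for v in range(1, n):
        u = rng.randrange(len(parent))   # uniform starting vertex
        # continue toward the root with probability p at each step
        while rng.random() < p and parent[u] is not None:
            u = parent[u]
        parent.append(u)                 # attach the new vertex at u
        deg.append(1)
        deg[u] += 1
    return parent, deg[0]
```

Varying the exploration-length distribution interpolates between the regimes in the abstract: long explorations push attachments toward the root (the ‘non-fringe’, condensation regime), while short ones keep attachments near the uniformly chosen vertex.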

Time: October 10, 4:15 – 5:15 pm EDT
No seminar: classes follow a Monday schedule

Time: October 17, 4:15 – 5:15 pm EDT
Speaker: Paul Jung
Location: The Graduate Center, Room 6417
Title:  TBD

Abstract: TBD

Time: October 24, 4:15 – 5:15 pm EDT
Speaker: Morris Ang
Location: The Graduate Center, Room 6417
Title:  TBD

Abstract: TBD

Time: October 31, 4:15 – 5:15 pm EDT
Speaker: Andrea Ottolini
Location: The Graduate Center, Room 6417
Title:  TBD

Abstract: TBD

Time: November 7, 4:15 – 5:15 pm EST
Speaker: Brad Rodgers
Location: The Graduate Center, Room 6417
Title:  TBD

Abstract: TBD

Time: November 14, 4:15 – 5:15 pm EST
No seminar: North-East Probability Seminar week

Time: November 21, 4:15 – 5:15 pm EST
Speaker: Jonas Arista
Location: Online seminar
Title:  TBD

Abstract: TBD

Time: November 28, 4:15 – 5:15 pm EST
Speaker: Shirshendu Chatterjee
Location: The Graduate Center, Room 6417
Title:  TBD

Abstract: TBD

Time: December 5, 4:15 – 5:15 pm EST
Speaker: Jinyoung Park
Location: The Graduate Center, Room 6417
Title:  TBD

Abstract: TBD