Department of Mathematics,
University of California San Diego
****************************
Special Geometry
Pengzi Miao
Stanford University
Mass, quasi-local mass and static metric extension in general relativity
Abstract:
We will first discuss a generalized Positive Mass Theorem on a class of piecewise smooth asymptotically flat manifolds with broken mean curvature across a hypersurface. Then we will relate it to Bartnik's quasi-local mass definition and explain how Corvino's scalar curvature deformation theorem implies that a minimal mass extension, if it exists, must be static. Finally, we will prove that, for any metric that is close enough to the Euclidean metric on a ball and has reflection-invariant boundary data, there always exists an asymptotically flat, scalar-flat and static metric extension satisfying Bartnik's geometric boundary condition.
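For orientation, the mass in question is the ADM mass of an asymptotically flat metric, which in the standard normalization (not restated in the abstract) reads:

```latex
% ADM mass of an asymptotically flat 3-manifold (M, g), computed in
% asymptotically flat coordinates over large coordinate spheres S_r;
% the Positive Mass Theorem asserts m_ADM >= 0 when the scalar
% curvature of g is nonnegative.
m_{\mathrm{ADM}}(g) \;=\; \frac{1}{16\pi}\,
  \lim_{r \to \infty} \int_{S_r}
  \sum_{i,j} \bigl( \partial_i g_{ij} - \partial_j g_{ii} \bigr)\,
  \nu^{j} \, dA
```

Bartnik's quasi-local mass of a compact region is then the infimum of the ADM mass over admissible asymptotically flat extensions, which is why static minimal-mass extensions are the natural objects here.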
-
AP&M 7321
****************************
Math 292 - Symplectic Topology
Justin Roberts
UCSD
Quantization, characters and asymptotics III
-
AP&M 7218
****************************
Math 278 - Numerical Analysis
Liz Fenwick
UCSD Graduate Student
Anisotropic Feature-Preserving Denoising of Height Fields and Bivariate Data
-
AP&M 7321
****************************
Math 288 - Special Statistics
Beth Andrews
Colorado State University
Maximum Likelihood and Rank Estimation for All-Pass Time Series Models
Abstract:
All-pass models are autoregressive-moving average models in which the roots of the autoregressive polynomial are reciprocals of roots of the moving average polynomial and vice versa. They generate uncorrelated (white noise) time series, but these series are not independent in the non-Gaussian case. Because all-pass series are uncorrelated, estimation methods based on Gaussian likelihood, least-squares, or related second-order moment techniques cannot identify all-pass models. Consequently, I use maximum likelihood and rank techniques to obtain parameter estimates. Maximum likelihood estimation has already been studied for autoregressive-moving average models. However, the parameters in the autoregressive polynomial of an all-pass model are functions of parameters in the moving average polynomial and vice versa, so the results for autoregressive-moving average models cannot be used for all-pass models. I discuss asymptotic properties of the two types of estimators, examine their behavior for finite samples via simulation, and consider an application for all-pass models--fitting noninvertible moving average models (known as nonminimum phase models in the engineering literature). I apply the results to stock market data. This is joint work with Jay Breidt and Richard Davis.
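The defining cancellation can be checked by simulation. The sketch below is illustrative only (it is not the estimation procedure of the talk): it generates an order-one all-pass series driven by non-Gaussian Laplace noise and computes lag-1 autocorrelations of the series and of its squares.

```python
import numpy as np

rng = np.random.default_rng(0)
n, phi = 200_000, 0.6

# Laplace (non-Gaussian) driving noise: with Gaussian noise an
# all-pass series would be exactly i.i.d. and the model unidentifiable.
z = rng.laplace(size=n)

# Order-1 all-pass filter: the AR root 1/phi is the reciprocal of the
# MA root phi, so x_t = phi*x_{t-1} + z_t - (1/phi)*z_{t-1} has a flat
# spectrum (white noise) at level Var(z)/phi^2.
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + z[t] - z[t - 1] / phi
x = x[1000:]  # discard burn-in

def acf1(s):
    """Lag-1 sample autocorrelation."""
    s = s - s.mean()
    return float(s[1:] @ s[:-1] / (s @ s))

r_x = acf1(x)        # near zero: the series is uncorrelated
r_x2 = acf1(x ** 2)  # typically nonzero: the series is not independent
```

That `r_x2` is typically nonzero while `r_x` vanishes is exactly why Gaussian-likelihood, least-squares, and other second-order methods cannot identify all-pass models.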
-
AP&M 6438
****************************
Math 258 - Differential Geometry
Yu Ding
UC Irvine
Analysis on tangent cones
-
AP&M 6438
****************************
Math 269 - Special Combinatorics
Glenn Tesler
UCSD
Genome Rearrangements in Mammalian Evolution: Lessons from Human and Mouse Genomes
Abstract:
Although analysis of genome rearrangements was pioneered by Dobzhansky and Sturtevant 65 years ago, we still know very little about the rearrangement events that produced the existing varieties of genomic architectures. The genomic sequences of human and mouse provide evidence for a larger number of rearrangements than previously thought. We describe a new algorithm for constructing synteny blocks, study arrangements of synteny blocks in human and mouse, derive a most parsimonious human-mouse rearrangement scenario, and provide evidence that intrachromosomal rearrangements are more frequent than interchromosomal. Our analysis is based on the human-mouse breakpoint graph, which reveals related breakpoints and allows one to find a most parsimonious scenario. We also provide the first evidence that the widely accepted Nadeau-Taylor model of chromosomal rearrangements must be revised, in view of details that were not visible prior to the availability of high-resolution genomic sequences.

Potential recruitment candidate for Bioinformatics
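A toy version of the breakpoint idea, for unsigned permutations measured against the identity (the analysis in the talk uses the richer signed, two-genome breakpoint graph), can be sketched as:

```python
def breakpoints(perm):
    """Count breakpoints of a permutation of 1..n against the identity:
    adjacent pairs that are not consecutive ascending. Sentinels 0 and
    n+1 frame the ends, as is standard in rearrangement analysis."""
    p = [0] + list(perm) + [len(perm) + 1]
    return sum(1 for a, b in zip(p, p[1:]) if b - a != 1)
```

The identity has zero breakpoints, and a reversal changes the count by at most two, which is what drives lower bounds in most parsimonious reversal scenarios.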
-
AP&M 6438
****************************
Special Colloquium
Arshak Petrosyan
University of Texas
Obstacle and Stefan type problems with no sign restriction
Abstract:
We show how to prove the regularity of free boundaries in these classical problems even if one drops the nonnegativity assumption on the solution (and on its time derivative in the case of the Stefan problem). This involves the application of two different kinds of monotonicity formulas: one due to L. Caffarelli, to prove the regularity of the solution, and the other due to G. Weiss, to classify the free boundary points by their homogeneity properties.
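For reference, Weiss's monotonicity formula for the classical obstacle problem $\Delta u = \chi_{\{u \neq 0\}}$ takes (up to normalizing constants, which vary in the literature) the form:

```latex
% Weiss functional centered at a free boundary point x_0; it is
% nondecreasing in r for solutions of the obstacle problem, and
% constant exactly when u is homogeneous of degree two about x_0.
W(r, u, x_0) \;=\; \frac{1}{r^{n+2}} \int_{B_r(x_0)}
  \bigl( |\nabla u|^2 + 2u \bigr)\, dx
  \;-\; \frac{2}{r^{n+3}} \int_{\partial B_r(x_0)} u^2 \, d\mathcal{H}^{n-1}
```

Monotonicity of $W$ forces blowups at free boundary points to be two-homogeneous, which is what allows the points to be classified by the homogeneity properties of their blowups.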
-
AP&M CHANGED TO
****************************
Special Statistics
Jiashun Jin
Stanford University
Detecting and Estimating Sparse Mixtures
Abstract:
Sparse mixture models have important applications in many areas, such as signal and image processing, genomics, and covert communication. In my talk, I will consider the problems of detecting and estimating sparse mixtures.

Detection: Higher Criticism is a statistic inspired by a multiple comparisons concept mentioned in passing by Tukey (1976) (the term itself goes back to the German historian Johann Eichhorn (1787)). We show that the resulting Higher Criticism statistic is effective at resolving a very subtle testing problem: testing whether $n$ normal means are all zero versus the alternative that a small fraction is nonzero. The subtlety of this `sparse normal means' testing problem can be seen from the work of Ingster (1999) and Jin (2002), who studied such problems in great detail. In their studies, they identified an interesting range of cases where the fraction of nonzero means is so small that the alternative hypothesis exhibits little noticeable effect on the distribution of the $p$-values, either for the bulk of the tests or for the few most highly significant tests. In this range, when the amplitude of the nonzero means is calibrated with their fraction, the likelihood ratio test for a precisely specified alternative would still succeed in separating the two hypotheses. We show that Higher Criticism is successful throughout the same region of amplitude versus sparsity where the likelihood ratio test would succeed. Since it does not require a specification of the alternative, this shows that Higher Criticism is in a sense optimally adaptive to the unknown sparsity and size of the non-null effects. While our theoretical work is largely asymptotic, we provide simulations in finite samples.
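A minimal implementation of the Higher Criticism statistic in the Donoho-Jin form (the truncation fraction `alpha0` below is a conventional choice, not something specified in the abstract):

```python
import numpy as np

def higher_criticism(pvals, alpha0=0.5):
    """HC* = max, over the smallest alpha0*n p-values, of the
    standardized gap between the sorted p-values and the uniform
    order statistics they would follow under the global null."""
    p = np.sort(np.asarray(pvals, dtype=float))
    n = len(p)
    i = np.arange(1, n + 1)
    hc = np.sqrt(n) * (i / n - p) / np.sqrt(p * (1.0 - p))
    return float(hc[i <= alpha0 * n].max())

rng = np.random.default_rng(1)
null = rng.uniform(size=10_000)   # global null: all p-values uniform
alt = null.copy()
alt[:30] = 1e-8                   # a sparse handful of strong signals
hc_null = higher_criticism(null)  # stays modest under the null
hc_alt = higher_criticism(alt)    # blows up under the sparse alternative
```

No alternative needs to be specified: the statistic reacts to any excess of moderately small p-values, which is the sense in which it adapts to unknown sparsity.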
We also show that Higher Criticism works very well over a range of non-Gaussian cases.

Estimation: False discovery rate (FDR) control is a recent innovation in multiple hypothesis testing, in which one seeks to ensure that at most a certain fraction of the rejected null hypotheses correspond to false rejections (i.e., false discoveries). The FDR principle can also be used in highly multivariate estimation problems, where it has recently been shown to provide an asymptotically minimax solution to the problem of estimating a sparse mean vector in the presence of Gaussian white noise. In effect, FDR provides an effective method of setting a threshold for separating signal from noise when the signal is sparse and the noise is Gaussian. In this talk we consider the application of FDR thresholding to non-Gaussian settings, in hopes of learning whether the good asymptotic properties of FDR thresholding as an estimation tool hold more broadly than just at the standard Gaussian model. We study the sparse exponential and sparse Poisson models, which are important models for non-Gaussian data and have applications in many areas, such as astronomy and positron emission tomography (PET). We show that the FDR principle provides an asymptotically minimax solution to the problem of estimating a sparse mean vector even in the presence of exponential/Poisson noise; in effect, FDR again gives an effective method of setting a threshold for separating signal from noise when the signal is sparse and the noise is exponential/Poisson. We compare our results with work in the Gaussian setting by Abramovich, Benjamini, Donoho, and Johnstone (2000).

Joint work with David L. Donoho.
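The thresholding rule itself, in its basic Benjamini-Hochberg step-up form (a sketch of the rule only; the talk's contribution concerns its minimaxity under exponential/Poisson noise):

```python
import numpy as np

def bh_reject(pvals, q=0.05):
    """Benjamini-Hochberg step-up rule: find the largest i with
    p_(i) <= q*i/n and reject everything at or below that p-value.
    The expected fraction of false rejections is then at most q."""
    p = np.sort(np.asarray(pvals, dtype=float))
    n = len(p)
    ok = np.nonzero(p <= q * np.arange(1, n + 1) / n)[0]
    if ok.size == 0:
        return 0.0, 0                 # nothing rejected
    k = int(ok[-1]) + 1
    return float(p[k - 1]), k         # data-driven threshold, count

thr, k = bh_reject([0.001, 0.008, 0.039, 0.041, 0.27, 0.6], q=0.05)
```

Used as an estimator, one keeps (or hard-thresholds at) only the coordinates whose p-values fall under this data-driven threshold, which is the sense in which FDR "sets a threshold for separating signal from noise."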
-
AP&M 6438
****************************
Math 288 - Probability
V. Rotar
SDSU
On local dependency on graphs, the CLT, and Stein's method
-
AP&M 6438
****************************
Math 292 - Special Topology/Geometry
Grigory Mikhalkin
University of Utah
Toric surfaces, Gromov-Witten invariants and tropical algebraic geometry
Abstract:
The talk presents a new formula for the Gromov-Witten invariants of arbitrary genus in the projective plane as well as for the related enumerative invariants in other toric surfaces. The answer is given in terms of certain lattice paths in the relevant Newton polygon. The length of the paths turns out to be responsible for the genus of the holomorphic curves in the count. The formula is obtained by working in terms of the so-called tropical algebraic geometry. This version of algebraic geometry is simpler than its classical counterpart in many aspects. In particular, complex algebraic varieties themselves become piecewise-linear objects in the real space. The transition from the classical geometry is provided by consideration of the "large complex limit" (which is also known as "dequantization" or "patchworking" in some other areas of mathematics).
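The tropical semiring that underlies all of this is small enough to sketch in code: addition becomes max, multiplication becomes ordinary addition, and polynomials become convex piecewise-linear functions (the max-plus convention is used here; the min-plus convention is equally common).

```python
def trop_add(a, b):
    """Tropical addition: the maximum."""
    return max(a, b)

def trop_mul(a, b):
    """Tropical multiplication: ordinary addition."""
    return a + b

def trop_poly(coeffs, x):
    """A one-variable tropical polynomial max_i (c_i + i*x):
    a convex piecewise-linear function whose corner locus plays
    the role of the zero set of a classical polynomial."""
    return max(c + i * x for i, c in enumerate(coeffs))
```

Tropical multiplication distributes over tropical addition exactly as in a ring, and a tropical plane curve is the corner locus of such a piecewise-linear function of two variables, which is how complex curves degenerate to piecewise-linear objects in the large complex limit.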
-
AP&M 7321
****************************
Mathematics Colloquium
Ivan Shestakov
University of Sao Paulo, Brazil
The Nagata automorphism is wild
Abstract:
It is well known that the automorphisms of polynomial rings and free associative algebras in two variables are "tame", that is, they admit a decomposition into a product of linear automorphisms and automorphisms of the type $(x,y)\mapsto (x,y+f(x))$. However, in the case of three or more variables the analogous question was open and known as "the generation gap problem" or "the tame generators problem". In 1972 Nagata constructed a certain automorphism of the polynomial ring in three variables and conjectured that it is non-tame, or "wild". The purpose of the present work is to confirm Nagata's conjecture. Our main result states that the tame automorphisms of the polynomial ring in three variables over a field of characteristic $0$ are algorithmically recognizable. In particular, the Nagata automorphism is wild.
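For concreteness, the Nagata automorphism in its usual explicit form (not restated in the abstract) fixes $z$ and the polynomial $\Delta = xz + y^2$, and its inverse is obtained by flipping one sign. A quick exact-arithmetic check at a sample point:

```python
from fractions import Fraction as F

def nagata(p):
    """Nagata's automorphism sigma(x, y, z) =
    (x - 2*y*D - z*D^2, y + z*D, z) with D = x*z + y^2,
    evaluated at a point with exact rational arithmetic."""
    x, y, z = p
    d = x * z + y * y
    return (x - 2 * y * d - z * d * d, y + z * d, z)

def nagata_inv(p):
    """Inverse of sigma. D = x*z + y^2 is invariant under sigma,
    so recomputing it at the image point gives the same value."""
    x, y, z = p
    d = x * z + y * y
    return (x + 2 * y * d - z * d * d, y - z * d, z)

pt = (F(1, 2), F(3), F(-2))
img = nagata(pt)  # sigma is a genuine automorphism: nagata_inv undoes it
```

The point is that this innocent-looking polynomial map, despite having an elementary inverse, cannot be written as a composition of linear and triangular automorphisms.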
-
AP&M 6438
****************************
Math 288 - Statistics
Anirban DasGupta
UCSD Visitor
Matching problems and Random Permutations
-
AP&M 5829
****************************

