The work of Misha Gromov has revolutionized geometry in many respects, while at the same time introducing a geometric point of view into many other questions. His impact is very broad, and one can say without exaggeration that many fields are not the same after the introduction of Gromov's ideas. I will try to explain several avenues that Gromov has been pursuing, stressing in non-technical terms the changes in point of view that he brought. Here is a list of topics that the lecture will touch on:
Symplectic topology can be thought of as the mathematical counterpart of string theory: the two were born independently at about the same time, the second as a fantastic enterprise to unify large-scale and small-scale physics, and the first to solve classical dynamical problems on periodic orbits of physical systems, the famous Arnold conjectures. In the 1980s, Gromov's revolutionary work opened a new perspective by presenting symplectic topology as an almost Kähler geometry (a concept that he defined), and by constructing the corresponding theory, which is entirely covariant (whereas algebraic geometry is entirely contravariant). A few years later, Floer and Hofer established the bridge between the two interpretations of symplectic topology, the one as a dynamical theory and the one as a Kähler theory. This bridge was confirmed for the first time by Lalonde-McDuff, who related the first theory explicitly to the second by showing that Gromov's Non-Squeezing Theorem is equivalent to Hofer's energy-capacity inequality.
Nowadays, symplectic topology is a very vibrant subject, and there is perhaps no other subject that produces new and deep moduli spaces at such a pace! More recent results will also be presented.
Avi Wigderson is a widely recognized authority in theoretical computer science. His main research area is computational complexity theory. This field studies the power and limits of efficient computation and is motivated by such fundamental scientific problems as: Does P=NP? Can every efficient process be efficiently reversed? Can randomness enhance efficient computation? Can quantum mechanics enhance efficient computation? He has received, among other awards, both the Nevanlinna Prize and the Gödel Prize.
What protects your computer password when you log on, or your credit card number when you shop online, from hackers listening in on the communication lines? Can two people who have never met create a secret language in the presence of others, which no one but them can understand? Is it possible for a group of people to play a (card-less) game of poker over the telephone, without anyone being able to cheat? Can you convince others that you can solve a tough math (or Sudoku) puzzle, without giving them the slightest hint of your solution? These questions (and their remarkable answers) are in the realm of modern cryptography. In this talk I plan to survey some of the mathematical and computational ideas, definitions, and assumptions which underlie the privacy and security of the Internet and electronic commerce. We shall see how these lead to solutions of the questions above and many others. I will also explain the fragility of the current foundations of modern cryptography, and the need for stronger ones. No special background will be assumed.
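One classical answer to the second question above (two strangers agreeing on a shared secret in public) is the Diffie-Hellman key exchange. The sketch below is illustrative only and is not drawn from the talk; the numbers are tiny toy parameters, whereas real systems use primes thousands of bits long.

```python
# Toy Diffie-Hellman key exchange (illustrative parameters only).
p, g = 23, 5                 # public prime modulus and generator
a, b = 6, 15                 # Alice's and Bob's private exponents

A = pow(g, a, p)             # Alice announces g^a mod p in the clear
B = pow(g, b, p)             # Bob announces g^b mod p in the clear

alice_key = pow(B, a, p)     # Alice computes (g^b)^a mod p
bob_key = pow(A, b, p)       # Bob computes (g^a)^b mod p
assert alice_key == bob_key  # both hold g^(ab) mod p; an eavesdropper sees only A and B
```

The security of this scheme rests on the assumed hardness of the discrete logarithm problem, an example of the kind of foundational assumption whose fragility the talk alludes to.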
Abstract: We all Google. You may even have found this talk by Googling. What you may not know is that behind Google and other search engines lies beautiful and elegant mathematics. In this talk, I will try to explain the workings of page ranking and search engines using only rusty calculus.
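As a taste of the mathematics behind page ranking, here is a minimal sketch of the standard PageRank power iteration on a hypothetical four-page link graph (the graph and parameter values are illustrative assumptions, not taken from the talk):

```python
def pagerank(links, damping=0.85, iters=100):
    """Power iteration for PageRank. links[i] lists the pages page i links to."""
    n = len(links)
    rank = [1.0 / n] * n
    for _ in range(iters):
        new = [(1.0 - damping) / n] * n  # random-surfer teleportation term
        for i, outs in enumerate(links):
            if outs:
                share = damping * rank[i] / len(outs)
                for j in outs:
                    new[j] += share
            else:  # dangling page: spread its rank uniformly
                for j in range(n):
                    new[j] += damping * rank[i] / n
        rank = new
    return rank

# Page 0 is linked to by all three other pages, so it earns the top rank.
ranks = pagerank([[1, 2], [0], [0, 1], [0]])
```

The iteration converges because it repeatedly applies a stochastic matrix; the ranks form its dominant eigenvector, which is the "rusty calculus" (really linear algebra) at the heart of search.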
Bio: Dr. Margot Gerritsen is the Director of the Institute for Computational and Mathematical Engineering at Stanford University. She is also the chair of the SIAM Activity group in Geoscience, the co-director and founder of the Stanford Center of Excellence for Computational Algorithms in Digital Stewardship, and the director of Stanford Yacht Research. She has been appointed to several prestigious positions, including Magne Espedal Professorship at Bergen University, Aldo Leopold Fellow, Faculty Research Fellow at the Clayman Institute and she is also a Stanford Fellow. She is the editor of the Journal of Small Craft Technology and an associate editor of Transport in Porous Media. We are delighted to have Dr. Gerritsen participate in the Mathematics of Planet Earth series.
Large-scale production of very heavy oil is gaining momentum because of the decline of easy-to-produce reservoirs and the increasing oil demand with its subsequent rising oil price, which makes such resources more economical. Considering the pressure on the oil market and our still very heavy dependence on oil, this move to heavy oil production seems inevitable. Typically, heavy oil reservoirs are stimulated thermally. Injecting steam that is generated at the surface is not always viable or desirable. An alternative production technique is In-Situ Combustion (ISC), in which a steam drive is generated in the reservoir itself. In this process, (enriched) air is injected into the reservoir. After ignition, a combustion front develops in situ that burns a small percentage of the oil in place and slowly moves through the reservoir, producing steam along the way. A side benefit of this process is that the heat thus generated often cracks the oil into heavy, undesirable components (the "guck") that stay behind in the reservoir, and lighter, more valuable components that can be brought up to the surface. Performance prediction of ISC projects is rather tricky and poses many computational challenges. In this talk I'll discuss our work in ISC simulation, which is centered on the design of upscaling methods for kinetics and critical reservoir heterogeneities, supported by laboratory experimentation.
The first duty of any epidemiologist is to ask a relevant
question. Learning and applying sophisticated epidemiologic methods is
of little help if the methods are used to answer irrelevant questions.
This talk will discuss the formulation of research questions in the
presence of time-varying treatments and treatments with multiple
versions, including pharmacological treatments and lifestyle
exposures. Several examples will show that discrepancies between
observational studies and randomized trials are often not due to
confounding, but to the different questions asked.
Miguel Hernán is Professor in the Departments of Epidemiology and Biostatistics at the Harvard School of Public Health (HSPH). His research focuses on the development and application of causal inference methods to guide policy and clinical interventions. He and his collaborators apply statistical methods to observational studies, under suitable conditions, to emulate hypothetical randomized experiments so that well-formulated causal questions can be investigated properly. His research applies to many areas, including the investigation of the optimal use of antiretroviral therapy in patients infected with HIV, and the assessment of various interventions for kidney disease, cardiovascular disease, cancer, and central nervous system diseases. He is Associate Director of the HSPH Program on Causal Inference in Epidemiology and Allied Sciences, a member of the Affiliated Faculty of the Harvard-MIT Division of Health Sciences and Technology, and an Editor of the journal EPIDEMIOLOGY. He is the author of the upcoming, highly anticipated textbook "Causal Inference" (Chapman & Hall/CRC, 2013); drafts of selected chapters are available on his website.
Bootstrap percolation, one of the simplest cellular automata, can be viewed as an oversimplified model of the spread of an infection on a graph. In the past three decades, much work has been done on bootstrap percolation on finite grids of a given dimension in which the initially infected set A is obtained by selecting its vertices at random, with the same probability p, independently of all other choices. The focus has been on the critical probability, the value of p at which the probability of percolation (eventual full infection) is 1/2.
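To make the model concrete, here is a minimal sketch of bootstrap percolation on an n × n grid, assuming the common r-neighbor rule (a healthy site becomes infected once at least r of its grid neighbors are infected, and infected sites stay infected; r = 2 is the classical two-dimensional case). The parameters and helper name are illustrative, not from the abstract.

```python
import random

def bootstrap_percolation(n, p, r=2, seed=0):
    """r-neighbor bootstrap percolation on the n x n grid.

    The initial set is chosen by including each site independently with
    probability p. Returns (percolates, steps), where percolates means the
    whole grid is eventually infected and steps counts update rounds.
    """
    rng = random.Random(seed)
    infected = {(i, j) for i in range(n) for j in range(n) if rng.random() < p}
    steps = 0
    while True:
        new = set()
        for i in range(n):
            for j in range(n):
                if (i, j) in infected:
                    continue
                # count infected neighbors among the (up to 4) grid neighbors
                k = sum(((i + di, j + dj) in infected)
                        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))
                if k >= r:
                    new.add((i, j))
        if not new:  # dynamics have stabilized
            break
        infected |= new
        steps += 1
    return len(infected) == n * n, steps
```

Running this for many random initial sets at varying p exhibits the sharp threshold behavior around the critical probability that the talk's first half concerns; the `steps` count is the percolation time studied in the second half.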
The first half of my talk will be a review of some of the fundamental results concerning critical probabilities proved by Aizenman, Lebowitz, Schonmann, Cerf, Cirillo, Manzo, Holroyd and others, and by Balogh, Morris, Duminil-Copin and myself. The second half will be about the very recent results I have obtained with Holmgren, Smith, Uzzell and Balister on the time a random initial set takes to percolate.