# Invited Talk Abstracts

**Venkat Anantharam, UC Berkeley**

### Steiner's formula and large deviations theory

The Shannon capacity of additive noise channels with transmitters whose codebooks are subject to unconventional constraints, such as those that arise in energy harvesting communications, motivates the study of Steiner's formula for the volume of the parallel body of a convex body in high dimensions, in the Shannon regime. More precisely, one considers parallel bodies of a sequence of convex bodies, one in each dimension, where the volume scales exponentially with the dimension. A remarkable parallel with large deviations theory emerges in this regime. This parallel will be highlighted and the underlying applications that motivate this viewpoint will be discussed.

Joint work with Varun Jog.
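In the plane, Steiner's formula is concrete enough to check numerically. The sketch below is our own illustration (function names are ours): it evaluates |K ⊕ rB| = |K| + r·per(K) + πr² for the unit square and confirms it against a Monte Carlo estimate of the parallel-body area.

```python
import math
import random

def steiner_area(area, perimeter, r):
    """Steiner's formula in the plane: |K + rB| = |K| + r * per(K) + pi * r^2."""
    return area + perimeter * r + math.pi * r * r

def dist_to_unit_square(x, y):
    """Euclidean distance from (x, y) to the unit square [0, 1]^2."""
    dx = max(0.0 - x, 0.0, x - 1.0)
    dy = max(0.0 - y, 0.0, y - 1.0)
    return math.hypot(dx, dy)

def mc_parallel_area(r, n=200_000, seed=1):
    """Monte Carlo estimate of the area of the r-parallel body of the unit square."""
    rng = random.Random(seed)
    side = 1.0 + 2.0 * r                 # bounding box side of the parallel body
    hits = sum(
        dist_to_unit_square(rng.uniform(-r, 1 + r), rng.uniform(-r, 1 + r)) <= r
        for _ in range(n)
    )
    return side * side * hits / n

r = 0.5
exact = steiner_area(1.0, 4.0, r)        # unit square: |K| = 1, per(K) = 4
approx = mc_parallel_area(r)
```

The Shannon-regime statement in the abstract concerns the exponential growth rate of such volumes as the dimension grows; this planar check only illustrates the formula itself.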

**Bartlomiej Blaszczyszyn, Ecole Normale Superieure, France**

### On a coverage problem in communications and its relations to Poisson-Dirichlet point processes

The SINR coverage process was introduced in the stochastic geometric context in 2001 to model coverage phenomena in wireless communications. Since then it has received a lot of attention, in both the engineering and applied probability communities. It is called the shot-noise coverage process in the recent edition of the famous book *Stochastic Geometry and Its Applications*. In our lecture, we will first briefly present a few historical (and perhaps historic) results regarding this model, in particular with respect to its typical cell coverage and percolation. Then we will focus on some more recent developments related to the stationary coverage properties of the model. In particular, we will show relations to the so-called two-parameter family of Poisson-Dirichlet point processes, which also appear in economics and physics in several other, apparently different, contexts.
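As a rough illustration of the kind of coverage quantity such models yield, the following Monte Carlo sketch (our own construction, not from the lecture) estimates the typical user's SIR coverage under nearest-point association with Rayleigh fading, and compares it with the well-known closed form for path loss exponent 4. The Poisson field is approximated by conditioning on its mean number of points.

```python
import math
import random

def sim_sir_coverage(theta, alpha=4.0, lam=1.0, radius=10.0, trials=3000, seed=2):
    """Monte Carlo estimate of P(SIR > theta) for a typical user at the origin.
    Base stations: a Poisson field on a disc of radius `radius`, approximated by
    conditioning on the mean point count. Fading: i.i.d. exponential (Rayleigh)
    power on every link; the nearest base station serves, all others interfere."""
    rng = random.Random(seed)
    n_points = round(lam * math.pi * radius ** 2)
    covered = 0
    for _ in range(trials):
        # r = R * sqrt(U) yields uniformly distributed points in the disc
        dists = sorted(radius * math.sqrt(rng.random()) for _ in range(n_points))
        powers = [rng.expovariate(1.0) * r ** (-alpha) for r in dists]
        if powers[0] > theta * sum(powers[1:]):
            covered += 1
    return covered / trials

# Known closed form for the Rayleigh, interference-limited case with alpha = 4:
#   P(SIR > theta) = 1 / (1 + sqrt(theta) * (pi/2 - atan(1/sqrt(theta))))
theta = 1.0
closed_form = 1.0 / (1.0 + math.sqrt(theta) * (math.pi / 2 - math.atan(1.0 / math.sqrt(theta))))
estimate = sim_sir_coverage(theta)
```

The simulation ignores interference from outside the disc, so the agreement with the closed form is only approximate.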

**Laurent Decreusefond, Telecom Paris Tech, France**

### The Stein-Dirichlet-Malliavin method and applications to stochastic geometry

We show how Malliavin calculus can enhance the general procedure of Stein's method. The versatility of this new approach is illustrated on two examples from stochastic geometry: superposition of determinantal point processes and Poisson polytopes. This talk is based on arXiv:1207.3517, arXiv:1406.5484 and some new results still under development.

**Harpreet Dhillon, Virginia Tech, USA**

### Modeling and Analysis of Clusters and Exclusion Zones in D2D Networks

Device-to-device (D2D) networks enable direct communication between proximate mobile devices, thereby improving spectral efficiency and offloading traffic from cellular networks. As is the case in any wireless network, accurate spatial modeling of D2D networks is necessary for a meaningful performance analysis. In this talk, we present a new comprehensive model for these networks with special emphasis on two spatial characteristics: (i) devices engaged in D2D communication typically exist in clusters, e.g., users in a coffee shop or a sports bar, and (ii) interference management may introduce exclusion zones in which D2D communication is not allowed. We first propose a new tractable approach to capturing exclusion zones in an otherwise homogeneous interference field. As an application of the proposed approach, we derive a meaningful bound on the Laplace transform of the interference originating from a Poisson Hole Process. Modeling the D2D network as a Thomas cluster process, we then derive easy-to-use expressions for both coverage probability and area spectral efficiency (ASE) under the assumption that each device has its content of interest available at another device in the same cluster. These results provide several key insights into optimal content placement that are unique to clustered D2D networks. The talk concludes by enriching the clustered D2D model with exclusion zones and studying their effect on the performance analysis.
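A Poisson Hole Process is simple to sample, and its retained density λ_b·exp(−λ_h·π·D²) gives a quick sanity check. The sketch below is a minimal illustration (function names are ours; boundary effects are ignored, so the agreement is only approximate).

```python
import math
import random

def poisson_rv(rng, mean):
    """Poisson sample via unit-rate exponential inter-arrival times."""
    t, k = rng.expovariate(1.0), 0
    while t <= mean:
        t += rng.expovariate(1.0)
        k += 1
    return k

def sample_php(rng, lam_b, lam_h, hole_radius, side):
    """One realization of a Poisson Hole Process on [0, side]^2: a baseline PPP
    with every point lying within hole_radius of a hole-centre PPP point
    carved out."""
    def ppp(lam):
        n = poisson_rv(rng, lam * side * side)
        return [(rng.uniform(0, side), rng.uniform(0, side)) for _ in range(n)]
    baseline, holes = ppp(lam_b), ppp(lam_h)
    return [p for p in baseline
            if all(math.hypot(p[0] - h[0], p[1] - h[1]) > hole_radius
                   for h in holes)]

# A baseline point survives iff no hole centre falls in the disc of radius D
# around it, so the retained density is lam_b * exp(-lam_h * pi * D^2).
rng = random.Random(3)
lam_b, lam_h, D, side = 5.0, 0.5, 0.4, 10.0
reps = 150
avg_kept = sum(len(sample_php(rng, lam_b, lam_h, D, side)) for _ in range(reps)) / reps
predicted = lam_b * math.exp(-lam_h * math.pi * D ** 2) * side * side
```

The retained points are negatively correlated with the hole centres, which is exactly what makes the Laplace transform of the PHP interference hard to compute exactly and bounds valuable.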

**Gerhard Fettweis and Vinay Suryaprakash**

### When Base Stations Meet Mobile Terminals, and Some Results Beyond

Wireless networks are designed based on population patterns. To densify networks, the organization of a cellular network is therefore done according to user density, to minimize the cost of investment. Since populations do not appear in the form of a regular geometric grid, the planning of networks should not either. Instead, a two-dimensional Poisson process is believed to be a good match for reality. This has been the basis for groundbreaking work by various researchers showing how stochastic geometry based on independent Poisson point processes can be applied. However, to minimize cost and maximize capacity, cellular networks are densified wherever the user density is high. Hence, modelling users and base stations as independent Poisson processes does not match reality, but instead generates "worst case" scenarios and, therefore, weak lower bounds. Instead, the user and base station processes must reflect this correlation to be able to model reality suitably. In this talk, we shall describe our efforts towards realizing this goal. During our investigations, we find that using a Neyman-Scott process to model users clustered around base stations could be a viable alternative.
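A Neyman-Scott (Thomas) process of users clustered around Poisson base stations can be sampled in a few lines. The sketch below is our own illustration; it checks that the resulting user intensity equals the parent intensity times the mean cluster size.

```python
import random

def poisson_rv(rng, mean):
    """Poisson sample via unit-rate exponential inter-arrival times."""
    t, k = rng.expovariate(1.0), 0
    while t <= mean:
        t += rng.expovariate(1.0)
        k += 1
    return k

def thomas_process(rng, lam_parent, mean_offspring, sigma, side):
    """Base stations as a PPP on [0, side]^2 and users as Poisson-sized
    Gaussian clusters around them: a Neyman-Scott (Thomas) process that
    couples the two point patterns instead of treating them as independent."""
    n_parents = poisson_rv(rng, lam_parent * side * side)
    parents = [(rng.uniform(0, side), rng.uniform(0, side)) for _ in range(n_parents)]
    users = [(rng.gauss(px, sigma), rng.gauss(py, sigma))
             for (px, py) in parents
             for _ in range(poisson_rv(rng, mean_offspring))]
    return parents, users

# The user intensity of a Thomas process is lam_parent * mean_offspring.
rng = random.Random(4)
lam_p, m, sigma, side = 0.5, 10.0, 0.3, 20.0
reps = 100
avg_users = sum(len(thomas_process(rng, lam_p, m, sigma, side)[1])
                for _ in range(reps)) / reps
expected_users = lam_p * m * side * side
```

Unlike two independent Poisson processes, every user here is by construction near a base station, which is the correlation the abstract argues real deployments exhibit.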

When densifying cellular networks even further, a hierarchy of cells is used, i.e. micro base stations are placed where capacity hot spots appear. Once again, correlating one process with the other (i.e. introducing a correlation among the base station processes) is pivotal for capturing reality accurately. To this end, we shall talk about our work on models using stationary Poisson cluster processes and how they can be used to study such networks.

Lastly, if time permits, we will also talk about linear algebra and matrix theory, hopefully extending the view of what is known today.

**Massimo Franceschetti, University of California-San Diego, USA**

### Information without rolling dice

The deterministic notions of capacity and entropy are studied in the context of communication and storage of information using square-integrable, bandlimited signals subject to perturbation. The (ε,δ)-capacity, which extends the Kolmogorov ε-capacity to packing sets in high dimensions with overlap at most δ, is introduced and compared to the Shannon capacity. The functional form of the results indicates that in both Kolmogorov's and Shannon's settings, capacity and entropy grow linearly with the number of degrees of freedom, but only logarithmically with the signal-to-noise ratio. This insight transcends the details of the stochastic or deterministic description of the information-theoretic model.

For δ=0, we provide new bounds on the Kolmogorov ε-capacity, and a tight asymptotic expression for the Kolmogorov ε-entropy of bandlimited signals. We also introduce a deterministic notion of error exponent, answering a question originally posed to the speaker by François Baccelli.

This is joint work with Taehyung J. Lim.
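The functional form mentioned above, linear in the degrees of freedom and logarithmic in the signal-to-noise ratio, can be made concrete with the Shannon-style expression n·log₂(1 + SNR); we use it here purely to illustrate the scaling, not as the talk's deterministic definition.

```python
import math

def bits(dof, snr):
    """Capacity-style quantity: linear in the degrees of freedom (dof),
    logarithmic in the signal-to-noise ratio (snr)."""
    return dof * math.log2(1.0 + snr)

base = bits(100, 15.0)          # 100 * log2(16) = 400 bits
double_dof = bits(200, 15.0)    # doubling dof doubles the count: 800 bits
double_snr = bits(100, 31.0)    # doubling (1 + snr) adds only dof bits: 500 bits
```

Doubling the number of degrees of freedom doubles the bit count, while doubling the (shifted) SNR adds only one bit per degree of freedom, which is the contrast the abstract emphasizes.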

**Martin Haenggi, University of Notre Dame, USA**

### Asymptotics and Meta-Distribution of the Signal-to-Interference Ratio in Wireless Networks

The signal-to-interference ratio (SIR) is a key quantity in wireless networks that determines the link reliability, rate, and delay. Its distribution function is usually interpreted as the outage probability of a link, and its complement is the success probability.

In the first part of this lecture, we focus on Poisson bipolar networks and determine the joint outage probability of multiple transmissions over the typical link. To obtain more fine-grained information about the individual links, we then find the distribution of the conditional success probability given the point process, which we call meta-distribution. It determines the fraction of links that achieve a target SIR with at least a certain probability.
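For a Poisson bipolar network with Rayleigh fading, the conditional success probability given the interferer locations has a product form, which makes the meta-distribution easy to estimate by Monte Carlo. The sketch below is our own illustration (the Poisson field is approximated by conditioning on its mean point count); it also checks the mean of the samples against the standard closed-form success probability.

```python
import math
import random

def cond_success_prob(points, theta, d, alpha):
    """Conditional success probability of the typical bipolar link of length d,
    given the interferer locations, under Rayleigh fading on all links:
        P_s(Phi) = prod_i 1 / (1 + theta * (d / r_i)^alpha)."""
    p = 1.0
    for (x, y) in points:
        r = max(math.hypot(x, y), 1e-12)   # guard against a point at the origin
        p *= 1.0 / (1.0 + theta * (d / r) ** alpha)
    return p

def sample_link_reliabilities(theta, d=0.5, alpha=4.0, lam=1.0, radius=10.0,
                              realizations=2000, seed=5):
    """Per-realization reliabilities over a Poisson interferer field (binomial
    approximation); their empirical distribution is the meta-distribution of
    the SIR at threshold theta."""
    rng = random.Random(seed)
    n = round(lam * math.pi * radius ** 2)
    probs = []
    for _ in range(realizations):
        pts = []
        for _ in range(n):
            r = radius * math.sqrt(rng.random())
            phi = rng.uniform(0.0, 2.0 * math.pi)
            pts.append((r * math.cos(phi), r * math.sin(phi)))
        probs.append(cond_success_prob(pts, theta, d, alpha))
    return probs

probs = sample_link_reliabilities(theta=1.0)
mean_ps = sum(probs) / len(probs)                          # standard success probability
frac_reliable = sum(p > 0.5 for p in probs) / len(probs)   # meta-distribution at x = 0.5
# Closed form for E[P_s] at lam=1, d=0.5, alpha=4, theta=1:
#   exp(-lam * pi * d^2 * sqrt(theta) * Gamma(3/2) * Gamma(1/2)) = exp(-pi^2 / 8)
closed_form = math.exp(-math.pi ** 2 / 8)
```

The usual outage analysis reports only `mean_ps`; `frac_reliable` is the extra, per-link information the meta-distribution provides.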

In the second part, we turn to cellular networks, where users are connected to the base station providing the strongest signal while all others interfere. After reviewing known results for the case of Poisson distributed base stations, including multi-tier models, we will discuss an approximation technique called ASAPPP for general stationary base station point processes. ASAPPP stands for "Approximate SIR Analysis based on the Poisson point process". It is based on the fact that the SIR distributions for different point processes are essentially just horizontally shifted versions of each other. Remarkably, this shift between the distributions is almost constant over the entire distribution and barely depends on the fading statistics or the path loss exponents, which shows that it is mainly a geometric quantity. In the derivations, a new type of point process, the so-called relative distance process, proves useful.

The second part is based on joint work with Radha Krishna Ganti (IIT Madras).

**Robert Heath, University of Texas at Austin, USA**

### Stochastic geometric analysis of massive MIMO networks

Cellular communication systems have proven to be a fertile ground for the application of stochastic geometry. This talk considers performance analysis in cellular systems that employ what is known as massive multiple-input multiple-output (MIMO) communication. With massive MIMO, each base station exploits a large number of antennas and uses them to serve tens of users. Massive MIMO is being studied extensively as part of fifth generation cellular networks. This presentation applies stochastic geometry to analyze the system performance of massive MIMO networks in two different configurations: uplink (users talk to base stations) and downlink (base stations talk to users). In the uplink analysis, an exclusion ball model is proposed to incorporate the correlations between the locations of the scheduled users, and is shown to match the real user process with negligible error. Further, simple expressions for the distributions of the uplink signal-to-interference ratio (SIR) and rate are derived, and reveal the scaling law that, to maintain the same aggregate uplink SIR distribution in the cell, the number of antennas should scale superlinearly with the number of scheduled users per cell, as a function of the path loss exponent. For the downlink, the asymptotic SIR and rate performance is investigated as the number of antennas goes to infinity. The analysis shows that the downlink SIR coverage increases with larger path loss exponent. The results confirm the high cell throughput of massive MIMO systems in large-scale networks.

**Philippe Jacquet, Bell Labs**

### Capacity estimates of wireless networks in Poisson shot model over uniform or fractal Cantor maps

We want to estimate the capacity of wireless networks under several models and several geometric assumptions. In all models we consider that all nodes communicate with a single fixed access point. We first provide a Shannon capacity upper bound, second we consider the flat outage capacity, and finally an optimized outage capacity. The nodes are randomly distributed over an infinite fractal Cantor map embedded in a space of dimension $D$, a model which is more realistic than the classic piecewise uniform map model. When the map is piecewise uniform, the capacities are independent of the node densities and of the fading distributions; they are constants depending only on the space dimension $D$ and on the signal attenuation factor. When the map is purely fractal, the same constants still hold, but with the fractal dimension $d_F$ replacing $D$. However, in this case the capacities show small periodic oscillations around these constants as the node density varies. The practical consequence of this result is that the capacity increases significantly when the network map has a small fractal dimension.

**Mohamed Karray, Orange Labs, France**

### How user throughput depends on the traffic demand in large cellular networks

We assume a space-time Poisson process of call arrivals on the infinite plane, independently marked by data volumes and served by a cellular network modeled by an infinite ergodic point process of base stations. Each point of this point process represents the location of a base station that applies a processor sharing policy to serve users arriving in its vicinity, modeled by the Voronoi cell, possibly perturbed by some random signal propagation effects. User service rates depend on their signal-to-interference-and-noise ratios with respect to the serving station. Little's law allows us to express the mean user throughput in any region of this network model as the ratio of the mean traffic demand to the steady-state mean number of users in this region.

Using ergodic arguments and the Palm theoretic formalism, we define a global mean user throughput in the cellular network and prove that it is equal to the ratio of the mean traffic demand to the mean number of users in the steady state of the "typical cell" of the network. Here, both means account for double averaging: over time and network geometry, and can be related to the per-surface traffic demand, base-station density and the spatial distribution of the signal-to-interference-and-noise ratio. The latter accounts for network irregularities, shadowing and cell dependence via some cell-load equations.

Inspired by the analysis of the typical cell, we also propose a simpler, approximate, but fully analytic approach, called the mean cell approach. The key quantity explicitly calculated in this approach is the cell load. In analogy to the load factor of the (classical) M/G/1 processor sharing queue, it characterizes the stability condition, the mean number of users and the mean user throughput. We validate our approach by comparing analytical and simulation results for the Poisson network model to real-network measurements.
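The processor-sharing mechanics can be illustrated on a single cell. In the sketch below (our own, with exponential volumes and parameter names of our choosing), the number of users evolves as an M/M/1 occupancy chain, and Little's law recovers the classical mean user throughput C(1 − ρ).

```python
import random

def simulate_ps_cell(lam, mean_vol, cap, horizon=200_000.0, seed=6):
    """Continuous-time simulation of one processor-sharing cell: Poisson call
    arrivals at rate lam, exponential data volumes with mean mean_vol, total
    capacity cap shared equally. With exponential volumes the total departure
    rate is cap/mean_vol whenever the cell is non-empty, so the occupancy is
    the M/M/1 chain. Returns the time-averaged number of users."""
    rng = random.Random(seed)
    mu = cap / mean_vol
    t, n, acc = 0.0, 0, 0.0
    while t < horizon:
        rate = lam + (mu if n > 0 else 0.0)
        dt = rng.expovariate(rate)
        acc += n * dt
        t += dt
        if rng.random() < lam / rate:   # next event is an arrival
            n += 1
        else:                           # next event is a departure
            n -= 1
    return acc / t

lam, mean_vol, cap = 0.6, 1.0, 1.0      # load rho = 0.6
rho = lam * mean_vol / cap
mean_users = simulate_ps_cell(lam, mean_vol, cap)
# Little's law: mean user throughput = mean traffic demand / mean number of users
throughput = lam * mean_vol / mean_users
predicted = cap * (1.0 - rho)           # classical PS result: C * (1 - rho)
```

In the talk's setting the capacity seen by a cell is itself random (it depends on the SINR geometry and the cell-load equations); this sketch fixes it to a constant to isolate the Little's-law step.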

**Wilfrid Kendall, University of Warwick, UK**

### Google maps and improper Poisson line processes

The theory of random lines has a celebrated history, reaching back 300 years to the work of Buffon, and forming a major part of the field of stochastic geometry. Recently it has found application in the derivation of surprising non-stochastic results concerning effective planar networks [1]. I plan to present an account of this, accessible to non-specialists, and also to describe more recent work concerning flows in related networks [2,3,4] and to introduce a rather curious random metric space [5].

References:

1. David J. Aldous, WSK. Short-length routes in low-cost networks via Poisson line patterns. Advances in Applied Probability 40 (2008), no. 1, 1-21.

2. WSK. Networks and Poisson line patterns: fluctuation asymptotics. Oberwolfach Reports (2008), no. 5, 2670-2672.

3. WSK. Geodesics and flows in a Poissonian city. Annals of Applied Probability 21 (2011), no. 3, 801-842.

4. WSK. Return to the Poissonian City. Journal of Applied Probability (2014), 15A, 297-309.

5. WSK. From Random Lines to Metric Spaces. Annals of Probability (2014), (to appear), 46pp.
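A Poisson line process is easy to sample in the standard (p, θ) parameterization. The sketch below is our own illustration; it checks the classical fact that the mean number of lines hitting a convex body K is λ·per(K)/π, i.e. 2λr for a disc of radius r.

```python
import math
import random

def poisson_line_process(rng, lam, p_max):
    """Sample the lines within distance p_max of the origin, in the standard
    (p, theta) parameterization: p is the signed distance from the origin to
    the line and theta in [0, pi) is the direction of its normal. With this
    normalization, the mean number of lines hitting a convex body K is
    lam * per(K) / pi."""
    mean = lam * 2.0 * p_max           # mean number of lines with |p| <= p_max
    t, n = rng.expovariate(1.0), 0     # Poisson count via exponential gaps
    while t <= mean:
        t += rng.expovariate(1.0)
        n += 1
    return [(rng.uniform(-p_max, p_max), rng.uniform(0.0, math.pi))
            for _ in range(n)]

def hits_of_disc(lines, r):
    """Lines hitting the disc of radius r centred at the origin: |p| <= r."""
    return sum(abs(p) <= r for (p, _theta) in lines)

# For the disc of radius r, per(K) = 2 pi r, so the expected hit count is 2 lam r.
rng = random.Random(7)
lam, p_max, r, reps = 2.0, 5.0, 1.0, 2000
avg_hits = sum(hits_of_disc(poisson_line_process(rng, lam, p_max), r)
               for _ in range(reps)) / reps
expected_hits = 2.0 * lam * r
```

The "improper" processes of the talk arise when the intensity measure in p is not locally finite; the sampler above covers only the proper, homogeneous case.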

**Ilia Molchanov, University of Bern, Switzerland**

### Random sets in economics, finance and insurance

The theory of random sets was greatly motivated by applications in economics and finance in the path-breaking works of Aumann and Shapley 50 years ago, which later resulted in two Nobel prizes. While the theory of random sets was subsequently applied mainly in spatial statistics, networks, image analysis and materials science, over the last ten years financial and econometric applications have again come to the frontline of the theory and greatly enriched it.

This mini-course aims to describe several modern directions of such applications with emphasis on statistical inference for partially identified models in econometrics, models with transaction costs in finance, and multivariate risk assessment covering both the multiasset case for a single agent and also a network of financial agents.

**Sayandev Mukherjee, DOCOMO Innovations**

### Stochastic Geometry and the User Experience in a Wireless Cellular Network

The last five years have seen a remarkable increase in our knowledge of the behavior of wireless cellular systems, aided primarily by the use of point process models for the locations of the base stations (access points) and users. In particular, when the locations of the base stations and users in a cellular network are modeled by the points of independent homogeneous Poisson point processes (PPPs), a nearly complete picture of the distribution of the signal-to-interference ratio at user terminals and base stations may be obtained analytically without approximation.

This talk is intended to provide an overview of how stochastic geometry can give us insights into the "user experience" in a cellular wireless network. The metrics we consider to quantify this user experience are throughput, bit rate, and spectral efficiency. Further, each of these metrics is studied from two viewpoints that are related but distinct. The first perspective is that of a subscriber to the cellular service, who is interested in the data rate he/she receives anywhere in the network (as opposed to the data rate enjoyed by any other user). The second perspective is that of the cellular network operator, who is interested in the distribution of data rate across all users located all over the network.

It is quite remarkable that, as we shall see, stochastic geometry provides the tools to analyze not only the association rules whereby a serving base station is chosen for a given user, but also the data rate to that user when the chosen serving base station transmits to this user, and the area spectral efficiency over all transmissions made by a "typical" base station anywhere in the network. In addition, we shall touch upon the ergodic rate on a link to a user and its similarity to, and difference from, area spectral efficiency. Lastly, we shall provide a brief overview of generalizations to multi-tier, so-called heterogeneous, cellular networks, and to transmissions involving multiple cooperative base stations.

**Volker Schmidt, University of Ulm, Germany**

### Mathematical tools for analysis, modeling and simulation of spatial networks on various length scales

Random point processes and random tessellations are fundamental classes of models in stochastic geometry. They can be used in order to describe structural properties of complex point patterns and cellular systems occurring in various disciplines and on various length scales, like telecommunication networks on geographic scales, and networks of particles, crystals and molecules in materials science on microscopic scales.

In this tutorial we discuss several examples of random point processes, including the homogeneous Poisson process, Poisson cluster processes, Poisson hard-core processes, and Cox processes. We show how these point-process models can be applied to describe the locations of objects in the plane and in higher-dimensional Euclidean spaces. Furthermore, we discuss examples of random tessellations, like Voronoi, Delaunay and Laguerre tessellations, which can be used to describe, e.g., networks of communication paths and the boundaries of cells in cellular systems.
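As a small illustration of the homogeneous Poisson process in this toolbox, the sketch below (our own) checks its contact distance distribution, P(D > r) = exp(−λπr²), by simulation.

```python
import math
import random

def contact_distance(rng, lam, window_radius):
    """Distance from the origin to the nearest point of one realization of a
    homogeneous PPP of intensity lam, simulated on a disc of radius
    window_radius around the origin."""
    mean = lam * math.pi * window_radius ** 2
    t, n = rng.expovariate(1.0), 0     # Poisson count via exponential gaps
    while t <= mean:
        t += rng.expovariate(1.0)
        n += 1
    # r = R * sqrt(U) gives uniform points in the disc; only distances matter
    dists = [window_radius * math.sqrt(rng.random()) for _ in range(n)]
    return min(dists) if dists else window_radius

# Contact distance distribution of the homogeneous PPP: P(D > r) = exp(-lam pi r^2)
rng = random.Random(9)
lam, r, reps = 1.0, 0.5, 5000
emp = sum(contact_distance(rng, lam, 5.0) > r for _ in range(reps)) / reps
theory = math.exp(-lam * math.pi * r ** 2)
```

The same simulation viewpoint, sampling locally around a chosen origin, is the idea behind the Palm-calculus-based local simulation of typical objects mentioned below.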

We also discuss the concept of Palm calculus for stationary point processes and stationary tessellations which can be used to justify algorithms for efficient (local) simulation of typical objects, e.g., the typical cell of a stationary tessellation.

Finally, we discuss several examples of structural characteristics of stochastic network models, and mathematical techniques to compute them, e.g., the distribution of typical shortest-path lengths, and other related structural characteristics, like connectivity, tortuosity and constrictivity of communication paths along the edges of stationary networks.

**Charlie Zhang, Samsung Telecommunications America, USA**

### Application of Stochastic Geometry in Modeling Future LTE-A and 5G Wireless Networks

As radio access technologies continue to evolve to support increasing traffic demand and new services, system models have been critical for research into performance trends which can lead to the development of new algorithms. However, it is often challenging to accurately incorporate many of the complexities of real-world deployments. In this talk we discuss the evolution of system models within the wireless communications industry and highlight the need for flexible analytical and simulation techniques, such as stochastic geometry, in the evaluation of emerging wireless communications technologies including HetNets, mmW cellular networks, and D2D. In particular, we highlight the use of stochastic geometry in throughput analysis for sensing-based spectrum sharing D2D networks. The spatial false alarm probability and detection probability are analyzed and a closed-form expression of the spectrum sensing threshold is derived. Assuming channel inversion is used, we analytically characterize the achievable rates for D2D users and cellular users. We find that in these hybrid sensing-based D2D and cellular networks, there exists an optimal sensing radius to maximize the proportional throughput of the system.

Co-authors are Thomas Novlan, Hao Cheng and Lingjia Liu.

**Sergei Zuyev, Chalmers University of Technology, Sweden**

### Generalised Eden growth model and random planar trees

In a classical Eden growth model, a crystal on a grid extends to nearby nodes at a rate proportional to the number of already crystallised neighbours. This model is equivalent to first-passage percolation with exponential passage times. An extension of this model is to allow a general weight for different numbers of crystallised neighbours. Classical subadditivity arguments allow establishing the existence of a limiting shape in the most natural case of monotone rates: the more neighbours are crystallised, the sooner a boundary node crystallises too. However, a real challenge is posed by non-monotone rates, for which all the classical approaches to establishing the shape result fail. Computer simulations suggest that even in this case a limiting shape does exist. We present partial results for such models and discuss some of their intriguing features: the existence of holes, the exponential bound on their size and possible long-range structural dependence.
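The generalised dynamics fit in a few lines of simulation. The sketch below (our own) grows a cluster with a pluggable weight function; the default weight(k) = k recovers the classical Eden model, while non-monotone choices of weight give the hard case described above.

```python
import random

def eden_growth(steps, rng, weight=lambda k: float(k)):
    """Generalised Eden growth on Z^2: starting from a seed at the origin, an
    uncrystallised boundary site with k crystallised neighbours crystallises
    next with probability proportional to weight(k). The default weight(k) = k
    is the classical Eden model (first-passage percolation with exponential
    passage times)."""
    def nbrs(x, y):
        return [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
    crystal = {(0, 0)}
    for _ in range(steps):
        # count crystallised neighbours of every uncrystallised boundary site
        boundary = {}
        for (x, y) in crystal:
            for b in nbrs(x, y):
                if b not in crystal:
                    boundary[b] = boundary.get(b, 0) + 1
        sites = list(boundary)
        pick = rng.choices(sites, weights=[weight(boundary[s]) for s in sites])[0]
        crystal.add(pick)
    return crystal

rng = random.Random(8)
cluster = eden_growth(400, rng)
```

With the classical weights the cluster is connected by construction and adds one site per step; with non-monotone weights the same loop can produce the holes whose size bounds the abstract discusses.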
