Modeling Catastrophic Events in Spark - Databricks



A reinsurance company’s core competencies include the quantification of risk associated with catastrophes, such as hurricanes and earthquakes. Various so-called catastrophe models are publicly available, some commercial and some open-source. The volume of data processed by such “cat models” requires Big Data and High Performance Computing capabilities, as is clearly reflected in the landscape of public models. The observed trend is towards increasingly detailed inputs as well as outputs, which makes scalability an important concern.
Companies that deal with catastrophe risk commonly use one or several public cat models. If they wish to differentiate themselves from the market, they may build internal proprietary models, in particular in areas that are not covered by existing models. The result is a deeper understanding and an independent quantification of risk, both of which can lead to a competitive edge.
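To make the kind of output a cat model produces concrete, the sketch below derives an occurrence exceedance probability (OEP) curve from a small simulated event-loss table. This is a standard catastrophe-model output, but the code is an illustrative, plain-Python sketch: the function name, data, and loss figures are hypothetical and not taken from the talk, and a production version would run this aggregation over a large stochastic catalog in Spark.

```python
# Hypothetical sketch: deriving an occurrence exceedance probability (OEP)
# curve from a simulated event-loss table, a core catastrophe-model output.
# All names and numbers are illustrative, not from the talk.

from collections import defaultdict

def oep_curve(event_losses, n_years):
    """event_losses: list of (year, loss) pairs from a stochastic catalog.
    Returns (loss, exceedance_probability) pairs, largest loss first."""
    # Occurrence basis: the largest single-event loss in each simulated year.
    max_loss_by_year = defaultdict(float)
    for year, loss in event_losses:
        max_loss_by_year[year] = max(max_loss_by_year[year], loss)

    yearly_maxima = sorted(max_loss_by_year.values(), reverse=True)
    # P(annual maximum loss >= x), estimated empirically over all
    # simulated years (years with no events contribute zero loss).
    return [(loss, (rank + 1) / n_years)
            for rank, loss in enumerate(yearly_maxima)]

# Tiny 5-year catalog: (simulation year, event loss in $M).
catalog = [(1, 120.0), (1, 30.0), (2, 15.0), (4, 500.0), (4, 80.0)]
curve = oep_curve(catalog, n_years=5)
# e.g. the largest annual loss, $500M, is exceeded in 1 of 5 years: p = 0.2
```

In a Spark implementation the same per-year maximum and ranking would be expressed as a `groupBy`/aggregation over a DataFrame of simulated events, which is what makes the computation scale to detailed model outputs.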

About Georg Hofmann

Georg Hofmann is the principal data scientist at Validus Research. He maintains positions as adjunct faculty at the Department of Mathematics and Statistics of Dalhousie University, Nova Scotia, Canada, and as a technical advisor of the Risk Analytics Lab of the Computer Science Department. He has worked in the reinsurance industry for 8 years while continuously publishing research on algorithms related to the implementation of catastrophe models. He received a Mathematics Diploma (Masters) with a minor in Quantum Mechanics from the Technische Universität Darmstadt (Germany), and received his Ph.D. in Mathematics from the same university.