The first British Seismology Meeting

Kathrin Lieser, Elizabeth Entwistle, Jennifer Weston and Dmitry Storchak report from the inaugural British Seismology Meeting, which attracted more than 100 participants from around the world.

In 2015, Elizabeth Entwistle and Jennifer Weston of the International Seismological Centre (ISC) realized that there had not been a wide-ranging cross-disciplinary seismological meeting in the UK since Frontiers of Seismology in Edinburgh in 2009 (A&G 2009 50 4.31). They approached ISC director Dmitry Storchak with the idea of restarting regular seismology meetings in the UK. The result was the British Seismology Meeting, the first full-scale scientific meeting organized by the ISC in its more than 50-year history.

The first British Seismology Meeting (BSM) was held at Reading Museum and Town Hall over three days from 5 to 7 April 2017, with 44 talks (including six invited talks) and 43 posters presented in eight sessions. The meeting attracted scientists from the UK and abroad to present and discuss seismological research, and to strengthen contacts and establish new links within and between seismological communities and industries. The organizers hope that BSM, with the help of UK universities, will become a regular event. This article provides a brief overview of each session, representing only a snapshot of what was a full and diverse scientific meeting.

The rise of induced seismicity

Day one started with invited speaker Torsten Dahm (GFZ German Research Centre for Geosciences, Germany) on “The weal and woe of induced seismicity”. Large-scale and long-term operations, such as dams or hydrocarbon production, can induce earthquakes with magnitudes greater than 6. The damage to buildings and infrastructure caused by such earthquakes is an obvious woe of induced seismicity. The weal (at least for seismologists) is that it sets challenges for technical and methodical innovations (e.g. improving detection and automatic location methods) as well as providing opportunities to study the physics of earthquake triggering and rupture, such as stress shadow effects and control of maximum magnitude.

Miles Wilson (Durham University) gave a global review of anthropogenically induced and naturally triggered earthquakes and presented a database, HiQuake, consisting of more than 700 events thought to be anthropogenically induced (figure 1). In recent years, reports of injection-induced earthquakes have increased. The largest induced earthquake in HiQuake is a Mw 7.9 event triggered by water impoundment in China in 2008; the majority of events in the database are of magnitude 3–4.

Figure 1. Proportions of cases reported in the HiQuake database of induced seismicity. (http://www.inducedearthquakes.org/graphs)

Subsequent speakers presented examples of induced seismicity. Antonio Villaseñor (Institute of Earth Sciences Jaume Almera, ICTJA-CSIC, Spain) reported on induced earthquakes off the coast of Spain, where a seismic sequence, the CASTOR crisis, was triggered after the injection of gas into an underground store in September and October 2013. His findings indicate that the three largest events, of about magnitude 4, originated deeper than the injection depth. He and co-workers proposed the reactivation of an unmapped NW–SE fault in the basement as an explanation.

In contrast to the CASTOR crisis, Ian Main (University of Edinburgh) discussed induced seismicity at the Hot Dry Rock test site for geothermal energy production in Cornwall, where the largest event is only Mw 0.5. Only a tiny fraction of the total available strain is released seismically in this highly compliant reservoir, while aseismic slip occurs on optimally oriented faults.
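The range of event sizes quoted above, from the Mw 7.9 reservoir-triggered event down to the Mw 0.5 Cornwall event, is easier to appreciate in terms of seismic moment. A short sketch using the standard textbook moment–magnitude relation (not part of any talk; just the conversion formula):

```python
def seismic_moment(mw):
    """Seismic moment M0 in N m from moment magnitude Mw, using the
    standard relation Mw = (2/3) * (log10(M0) - 9.1)."""
    return 10 ** (1.5 * mw + 9.1)

m0_china = seismic_moment(7.9)     # largest induced event in HiQuake
m0_cornwall = seismic_moment(0.5)  # largest event at the Cornwall site

# One magnitude unit is a factor of ~31.6 in moment, so these two
# events differ in moment by more than 11 orders of magnitude.
ratio = m0_china / m0_cornwall
print(f"moment ratio: {ratio:.3g}")  # ~1.26e+11
```

The factor of roughly 10^11 in moment is why the "woe" of induced seismicity is dominated by a handful of large events even though small ones are far more numerous.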
Staying in the UK, and moving to smaller magnitudes, Anna Horleston (University of Bristol) investigated anthropogenic noise recorded on temporary seismic networks. Prior to any industrial activity in the UK, at least one year of monitoring of the natural seismicity level is required, ideally with a magnitude of completeness of Ml = 0, which is a challenge to instrumentation and monitoring methods. Two dense temporary networks were deployed in Lancashire and Yorkshire, and the seismic noise was analysed to study the effects of site location, burial material and deployment method. She found that the distance between a station and the source of seismic noise has a greater effect on data quality than the deployment method. Testing for efficacy after the installation of monitoring networks is necessary to optimize array design.

Moving on to magnitudes, Richard Luckett (British Geological Survey) investigated inconsistencies between station magnitudes measured at near and far stations. As very small earthquakes are detectable only by close stations, the problem arises of how to estimate magnitudes if you cannot trust readings from those stations. To solve this problem, he added an extra, region-dependent, exponential term to the magnitude equation; this brings near-source station magnitudes into agreement with those at other stations without changing the vast majority of existing catalogue magnitudes.

Investigating unusual signals

The final session of day one focused on unusual signals, most of which arose from nuclear explosions. Anton Ziolkowski (University of Edinburgh) explained how he determined the yields and source time functions of announced North Korean (DPRK) nuclear tests. Estimating yields here is problematic because, unlike at other nuclear test sites, the parameters for the conventional approach relating yield to body-wave magnitude mb are not known. His method requires two explosions at the same location.
Then the Earth impulse response between the two sources and a given seismometer is the same, and the source time functions are scaled versions of each other, with the scaling factor being the cube root of the energy ratio. Computing the spectral ratio of the source time functions and applying the scaling law allows yields to be estimated.

Invited speaker Steven Gibbons (NORSAR, Norway) used the nuclear explosions at the DPRK test site as an example of how to formulate the relative location problem as a source-array problem. Gibbons focused on accurate seismic event location and showed examples where earthquake location can be challenging, e.g. at the Arctic Mid-Atlantic Ridge (only teleseismic phases, but good azimuthal coverage) or the Kashmir earthquake aftershocks (few stations at regional distances and poor azimuthal coverage). His approach to these problems is to consider multiple seismic events together to mitigate the effects of deficiencies in the velocity model. Using probability-based multiple-event location algorithms (Bayesloc), good estimates of the uncertainty and bias of travel times along individual paths can be made, even when estimates of the full velocity structure are difficult or impossible. For regions with significant seismicity, knowledge of the velocity model therefore does not need to be perfect in order to improve seismic event locations significantly.

The ISC's Kostas Lentas showed how the IASPEI Ground Truth (GT) Reference Event List for monitoring purposes is assembled. Events are divided into categories (GT0, GT1, GT2 and GT5), where the epicentre of a GTx event is known to within x km at a 95% confidence level. The best-determined GT events (GT0–2) originate from nuclear or chemical explosions, while GT5 events also include earthquakes. At the end of the session we moved on to Mars.
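Before leaving explosions behind, the cube-root scaling at the heart of Ziolkowski's method can be illustrated with a toy example. The sketch below uses a synthetic pulse and an invented yield ratio (none of these numbers come from the talk): because the Earth impulse response cancels for co-located sources, the amplitude scaling factor between the two source time functions can be measured directly, and cubing it recovers the yield ratio.

```python
import numpy as np

# Invented source time function (STF) for a reference explosion.
t = np.linspace(0, 1, 200)
stf1 = t * np.exp(-8 * t)

# A second explosion at the same location: same pulse shape, with
# amplitude scaled by the cube root of the (invented) yield ratio.
true_yield_ratio = 4.0
stf2 = true_yield_ratio ** (1 / 3) * stf1

# For co-located sources the Earth impulse response cancels, so the
# amplitude scaling factor s can be read off the two STFs (here by a
# least-squares fit) and the yield ratio recovered as s**3.
s = np.dot(stf1, stf2) / np.dot(stf1, stf1)
estimated_ratio = s ** 3
print(f"estimated yield ratio: {estimated_ratio:.2f}")  # 4.00
```

In practice the scaling factor is estimated from spectral ratios of real, noisy recordings rather than a clean least-squares fit, and an absolute yield still requires one explosion of known yield as reference.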
Jennifer Stevanovic (AWE Blacknest) explained how bolide airbursts will function as a seismic source for the 2018 Mars InSight mission, which will place a single geophysical lander, including a seismometer, on Mars. About 10–200 detectable events per year are expected for InSight. She showed how analogue terrestrial data are used to identify diagnostic airburst characteristics in the time and frequency domains (figure 2).

Figure 2. Rare examples of possible surface evidence for airbursts on Mars. (From Stevanović J et al. 2017 Space Science Reviews 1–21)

Passive sources and sensitive signals

Day two of the conference began with invited speaker Eleonore Stutzmann (Institut de Physique du Globe de Paris, France) investigating frequency-dependent microseism sources. Seismic signals with periods of about 3–300 s can be explained by two mechanisms in the ocean: wave–sea floor interactions at the coast, and wave–wave interactions. Stutzmann analysed secondary microseisms of 3–10 s period, generated by ocean wave–wave interactions, that were recorded by several networks. Using a beamforming approach, she traced the sources of these microseisms to typhoons, ocean waves reflected at coasts, and icebergs.

Anna Stork (University of Bristol) set out to determine the usefulness of seismic monitoring in the event of a leak from a CO2 storage site. Stork's test site is a dense geophone array at the first commercial carbon capture and storage site in Saskatchewan, Canada. To assess the likelihood of induced seismicity in the case of a CO2 leak, she modelled fluid flow along a vertical fault. She found that the fracture pressure will be exceeded – potentially inducing seismic events – when the CO2 reaches depths of less than 500 m.
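Stork's result comes from a full fluid-flow simulation, but the underlying buoyancy argument can be caricatured in a few lines: a connected CO2 column loses pressure with height more slowly than the surrounding water does, so somewhere above the reservoir it can overtake the fracture-pressure curve. All densities and gradients below are invented round numbers, not values from her model, so the crossover depth here (about 1 km) is purely illustrative.

```python
import numpy as np

# All numbers here are invented round values, not Stork's site data.
g = 9.81                             # m/s^2
rho_water, rho_co2 = 1000.0, 700.0   # densities, kg/m^3
frac_gradient = 15e3                 # fracture pressure gradient, Pa per metre of depth

z = np.linspace(100.0, 3000.0, 500)  # depth below surface, m
z_res = 3000.0                       # leaking reservoir depth, m
p_res = rho_water * g * z_res        # hydrostatic pressure at the reservoir

# Pressure inside a connected CO2 column extending up from the reservoir:
# it decreases upward with the (small) CO2 density, not the water density.
p_co2 = p_res - rho_co2 * g * (z_res - z)
p_frac = frac_gradient * z

# Deepest point at which the rising column already exceeds fracture pressure.
crossover = z[p_co2 > p_frac].max()
print(f"fracture pressure exceeded at depths shallower than ~{crossover:.0f} m")
```

The qualitative point survives the invented numbers: the shallower the CO2 gets, the larger its overpressure relative to the fracture threshold, which is why Stork's model flags depths above a few hundred metres as the hazardous zone.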
At present, ambient noise interferometry (ANI) and tomographic inversion methods have not detected seismic velocity changes associated with CO2 leakage at the site. But with more data and a larger array, ANI could provide cost-effective, near-real-time monitoring.

Invited speaker Tom Mitchell (University College London) kicked off the second session with the mysteries of fault behaviour, analysing earthquake fracture damage and healing over the seismic cycle. Damaged rocks develop different wave and rupture propagation patterns and, because of their increased permeability, play a key role in fluid migration in and around fault zones over the seismic cycle. However, little is known about the properties of damage zones and their evolution through the seismic cycle. Mitchell shared laboratory and field examples of proposed generation mechanisms of pulverized rocks: intensely damaged fault rocks that have undergone minimal shear strain. Crustal velocity reductions at depth after large earthquakes suggest co-seismic fracture damage, while a time-dependent increase in seismic velocities after the damage, lasting from days to years, implies healing processes such as cracks closing, filling with fluids or being sealed by minerals. Little is known about what controls healing rates, but laboratory experiments hint that fractures heal faster at elevated temperatures.

Moving to a smaller scale, Christopher Harbord (Durham University) investigated frictional instabilities on rough faults by creating home-made faults in the lab. He cuts Westerly granite vertically, polishes the fracture plane to a fine (<1 μm roughness) finish and then roughens it with a range of grits, before performing a series of velocity-step friction experiments in a triaxial deformation apparatus. The pattern of stability transitions suggests that bare rough faults require a different stability criterion from the other fault surfaces, dependent on weak-patch scaling and nucleation length.
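The stability criterion Harbord refers to comes from rate-and-state friction, where slip on a velocity-weakening fault becomes unstable once the stiffness of the loading system falls below a critical value k_c = σn(b − a)/Dc. A minimal sketch with illustrative laboratory-scale numbers (invented, not Harbord's measurements):

```python
def critical_stiffness(sigma_n, a, b, d_c):
    """Critical stiffness k_c = sigma_n * (b - a) / d_c from rate-and-state
    friction; a velocity-weakening fault (b > a) slips unstably when the
    loading system's stiffness is below k_c."""
    return sigma_n * (b - a) / d_c

# Illustrative laboratory-scale values (invented, not Harbord's data).
sigma_n = 100e6        # effective normal stress, Pa
a, b = 0.010, 0.015    # rate-and-state friction parameters
d_c = 10e-6            # critical slip distance, m

k_c = critical_stiffness(sigma_n, a, b, d_c)  # ~5e10 Pa/m
k_machine = 2e10                              # apparatus stiffness, Pa/m (invented)

print("stick-slip (unstable)" if k_machine < k_c else "stable sliding")
```

Harbord's point is that for bare rough surfaces this single-patch criterion appears insufficient: whether stick-slip occurs also depends on how weak contact patches scale relative to the nucleation length.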
Looking at the rupture process in more detail, Jessica Hawthorne (University of Leeds) asked whether standard rupture models and speeds are appropriate for low-frequency earthquakes, such as those that occur in tremor. In an analysis of many small low-frequency earthquakes near Parkfield, California, the rupture extents were 2 to 4 times smaller than a standard model would predict for their durations. Low-frequency earthquakes may have different rupture dynamics, or attenuation may be a factor.

How hazardous?

The next session focused on seismic hazard in regions of low to moderate seismicity. Roland Roberts (Uppsala University, Sweden) reported on the Swedish National Seismic Network (SNSN) and why Sweden, despite lying in a stable continental area, invests in seismic monitoring. Although the seismic risk in Fennoscandia is low, it is not negligible. Sweden experienced large earthquakes at the end of the last ice age, and it is not clear whether such events can occur in an interglacial period. Furthermore, monitoring seismicity is important for the nuclear industry and for assessing hazards such as triggered landslides and damaged dams. A problem in areas of low to moderate seismicity is that standard hazard assessment methods are misleading because the input data are too sparse and of too low quality; probabilistic seismic hazard assessment (PSHA) could not be successfully applied in Sweden.

Gloria Senfaute and Emmanuel Viallet (both Electricité de France, EDF, France) discussed ways of tackling the shortcomings of PSHA for regions with low to moderate seismicity, including the introduction of historical data and Bayesian approaches to reduce epistemic uncertainties, such as the assessment of the maximum earthquake magnitude, in PSHA calculations for France.

The session ended with the ISC's Domenico Di Giacomo giving an overview of the ISC-GEM catalogue for global seismic hazard purposes.
The first release of the ISC-GEM Global Instrumental Earthquake Catalogue covered the period 1900–2009 and aimed to substantially extend, improve and homogenize existing bulletin data for large global earthquakes (magnitude 5.5 and above) to provide basic earthquake parameters for users who assess and model seismic hazard and risk. In 2013, a four-year extension project was started to include global earthquakes (magnitude 5.5 and above) that occurred after 2009, as well as smaller earthquakes (magnitude 5.5 up to about 6.2–6.3) from 1904 to 1959.

Catalogues: power and completeness

Following a busy poster session, Yuzo Ishikawa (AIST, Geological Survey of Japan) opened the last session of day two by explaining the difficulties of hypocentre determination in the early 20th century. A major problem was that stations often had different clocks, corrected manually once a day, making arrival times unreliable; nevertheless, the more reliable S–P times from historical data can still be used. Another problem was that the magnification of old seismographs was often very low, so arrival times were often misread. In the discussion after the talk, Robin Adams gave two examples of earthquakes that were grossly mislocated because: (1) surface waves were picked instead of body-wave phases; and (2) P'2 phases from a source in the Pacific were misinterpreted as P phases, erroneously creating a M5 event in Cameroon!

Marleine Brax (National Council for Scientific Research, Lebanon) presented an initial version of a seismic catalogue for seismic hazard assessment in Lebanon, built from historical and modern instrumental data. The historical data, covering about 2000 years, had to be reviewed carefully. Some events turned out to be two events several years apart, because of different timing methods in the underlying studies. Also, some studies mixed up city names and reported earthquakes that were actually located in different areas.
Furthermore, magnitudes from the recent instrumental catalogue and the historical catalogue were found to be inconsistent. Given these factors, the seismic record might not be representative of the seismic potential in Lebanon. Thus PSHA might underestimate the seismic potential, and fault modelling might be a more effective approach for estimating seismic hazard.

Pierre Arroucau (Dublin Institute for Advanced Studies) demonstrated that Ireland's seismicity is not as elusive as typically thought. It is usually characterized as a few low-magnitude events, as reflected in the seismic hazard map for Europe (figure 3), yet Arroucau proudly presented 218 new events from 2010–16 that were detected using waveform cross-correlation. Because this method is limited to repeating events, there may be even more seismicity in Ireland waiting to be detected.

Figure 3. The European Seismic Hazard Map displays the ground shaking (i.e. peak horizontal ground acceleration) to be reached or exceeded with a 10% probability in 50 years. (© 2013 ETH Zurich on behalf of EU-FP7 Consortium of SHARE. D Giardini, J Woessner, L Danciu, H Crowley, F Cotton, G Grünthal, R Pinho, G Valensise and SHARE consortium)

To finish the second day, Dmitry Storchak (ISC) gave an overview of ISC datasets and services. The ISC's main mission is to provide the definitive parametric information on recent and past earthquakes and other seismic events. The ISC collects and integrates seismic bulletin data from about 140 networks worldwide, making the ISC Bulletin the most complete global long-term source of seismic bulletin information.
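Returning briefly to Arroucau's method: detection by waveform cross-correlation slides a known template event through continuous data and flags windows whose normalized correlation exceeds a threshold, which is why it only finds near-repeats of events already in hand. A minimal sketch on synthetic data (invented template, noise level and threshold):

```python
import numpy as np

def xcorr_detect(data, template, threshold=0.8):
    """Indices where the normalized cross-correlation between the
    template and the sliding data window exceeds the threshold."""
    n = len(template)
    t = template - template.mean()
    t_norm = np.linalg.norm(t)
    hits = []
    for i in range(len(data) - n + 1):
        w = data[i:i + n] - data[i:i + n].mean()
        denom = t_norm * np.linalg.norm(w)
        if denom > 0 and np.dot(t, w) / denom > threshold:
            hits.append(i)
    return hits

# Synthetic continuous record with two buried repeats of a template event.
rng = np.random.default_rng(1)
template = np.sin(np.linspace(0, 6 * np.pi, 60)) * np.exp(-np.linspace(0, 4, 60))
data = 0.05 * rng.standard_normal(2000)
data[500:560] += template          # a clear repeat
data[1400:1460] += 0.5 * template  # a weaker repeat

hits = xcorr_detect(data, template, threshold=0.7)
print(hits)  # detections cluster near samples 500 and 1400
```

Because the normalization removes amplitude, even the half-size repeat correlates strongly; an event with a genuinely different waveform, however, would be missed, which is the limitation noted above.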
Imaging the Earth at all scales

The first session of the final day saw images of the Earth at all scales, from crust to core. Keith Priestley (University of Cambridge) used receiver functions and high-frequency surface-wave analysis to investigate the lithospheric structure across the Indo-Eurasian belt. He found that radioactive heating of the Tibetan crust resulted in a crustal temperature inversion and lowered velocities in the upper mantle beneath northern Tibet.

Despite being responsible for one of the largest volcanic eruptions in history, little is known about Mt Paektu/Changbaishan because it lies on the border between the DPRK and China (figure 4), making it difficult to carry out comprehensive studies. James Hammond (Birkbeck, University of London) was one of the first western scientists invited to the DPRK to study the volcano, which had shown recent signs of unrest (deformation, and increases in seismicity and volcanic gases); he deployed the first broadband seismometers in the DPRK. Until then, data had been collected on the Chinese side only. Receiver function analysis from the DPRK side reveals high Vp/Vs ratios and indicates the presence of significant melt beneath the volcano. Future work will involve combining the Chinese and DPRK datasets.

Figure 4. Mt Paektu/Changbaishan at the border of North Korea and China. (Google Maps)

Invited speaker Andy Nowacki (University of Leeds) investigated the two large low-shear-velocity provinces (LLSVPs) at the core–mantle boundary (CMB). These provinces show reductions in shear-wave velocity of ∼3% and are located antipodally beneath the Pacific Ocean and continental Africa. Little is known of their origin (thermal or chemical?) and structure, because tomographic studies cannot resolve them sufficiently.
To get more information on these structures, Nowacki studied deviations of P and Pdiff phases along their travel paths from Pacific earthquakes to the Yellowknife Array in Canada, providing good coverage of the edge of the Pacific LLSVP. Distributions of backazimuthal deviations and depths of turning points suggest a pipe-like structure of low P-wave velocity, about 200 km in diameter and rising at least 300 km above the CMB northwest of Hawaii. Such a structure might be connected with the Hawaiian hotspot.

Dealing with big data

Invited speaker Tarje Nissen-Meyer (University of Oxford) reflected on big data in the context of Occam's razor: the principle of parsimony stating that a simpler, less complicated solution should be preferred over a more complicated one if they lead to similar results. It is a challenge, for example, to manage the computational costs (including energy costs) of calculating wave propagation through complex 3D crustal structure while maintaining sufficiently accurate wave physics. The choices, compromises and approximations necessary to make computations feasible have to be considered carefully. Does the output justify a complex model, or can similar results be achieved by simpler models with lower computing costs? Finding the proper balance is challenging.

Kuangdai Leng (University of Oxford) presented results from AxiSEM3D, a modelling scheme developed by Nissen-Meyer and others that tries to address some of these issues. It is a spectral element method that computes 3D global seismic wavefields for realistic earthquake sources using asymmetric background models, allowing the crust to be implemented in 3D models. This is computationally challenging because the crust's thickness varies drastically between oceanic and continental regions. The results of AxiSEM3D are comparable to (or better than) those from a full SEM, but obtained up to twice as fast.
Natalia Poiata (National Institute for Earth Physics, Romania) presented BackTrackBB, a fully automated detection and location method designed to cope with increasing volumes of seismological data. It is an array-based automatic method that back-tracks the broadband signal to its origin, and is applicable to dense networks and continuous data. Importantly, it does not require preliminary information about seismic sources. After calculating a characteristic function of the signal, earthquake detection and location is accomplished by back-projecting station-pair time-delay estimates according to theoretical time delays across the network. The method has been successfully applied to low-frequency earthquakes in Japan and a seismic swarm in Romania, and can detect events hidden in seismic coda, in noise or with emergent onsets.

Earthquake source parameters, such as depth and the source time function (STF), are necessary for global waveform modelling, but determining STFs manually is time-consuming. Karin Sigloch (University of Oxford) presented a fully probabilistic seismic source inversion that makes the determination of source parameters more efficient by decomposing 900 robust, manually deconvolved STFs into empirical orthogonal functions (EOFs). Approximating the posterior distribution of the STF by weighting the first 4–8 EOFs yields an ensemble of solutions that fit the observed waveforms well, allowing conclusions to be drawn about the source time function, mechanism and depth. Rather than choosing one best solution, the uncertainties of the STFs are propagated into the travel-time delays used in tomography.

The ISC's Jennifer Weston finished the last session of the meeting by explaining how the EHB catalogue is being rebuilt and extended to create the ISC-EHB catalogue. The EHB catalogue consists of teleseismically well-constrained events selected from the ISC Bulletin, which are widely used for tomographic inversions.
The EHB catalogue ends in 2008; the aim of the ISC-EHB is to extend it, and to recreate it with new, more rigorous procedures for event selection, data preparation, processing and relocation.

Closing remarks

Alongside the busy speaker schedule, an active and diverse poster session provided the opportunity for many discussions. Poster presentations from early-career researchers were also judged: two prizes sponsored by AWE (who also sponsored eight student conference grants) were awarded during the conference dinner, to Emily Crowder (University of Aberdeen) for her presentation on using dispersion analysis of ambient seismic noise for hydrocarbon exploration, and to Jennifer Jenkins (University of Cambridge) for her presentation on constraining Icelandic crustal structure using receiver function analysis.

Overall, there was great support from the seismology community, and feedback showed clear demand for another event. At the end of the conference, Anton Ziolkowski offered the University of Edinburgh as a potential venue for BSM 2019 – we hope to see you there!

ACKNOWLEDGMENTS

We would like to thank the conference sponsors for their support: the British Geophysical Association (BGA), AWE, Güralp Systems Ltd and Optics11. We are also grateful to the staff of Reading Town Hall and Museum, Jane Austen and Keith Andrews, and to Sheila Peacock of AWE for her help and advice in organizing and planning this meeting. The organizer of the meeting, the ISC, is supported by 65 member institutions in 49 countries, including the National Science Foundation (Award 1417970), and several sponsors from both the public and commercial sectors.

FURTHER INFORMATION

The BSM 2017 website holds talks and abstracts: http://bsm2017.isc.ac.uk

© 2018 Royal Astronomical Society. Astronomy & Geophysics, Oxford University Press.

DOI: 10.1093/astrogeo/aty031
Kathrin Lieser, Elizabeth Entwistle, Jennifer Weston and Dmitry Storchak report from the inaugural British Seismology Meeting, which attracted more than 100 participants worldwide. In 2015, Elizabeth Entwistle and Jennifer Weston of the International Seismological Centre (ISC) realized that there had not been a wide-ranging cross-diciplinary seismological meeting in the UK since Frontiers of Seismology in Edinburgh in 2009 (A&G 2009 50 4.31). They approached ISC director Dmitry Storchak with the idea of restarting regular seismology meetings in the UK. The result was the British Seismology Meeting, the first full-scale scientific meeting organized by the ISC in its more than 50 year history. The first British Seismology Meeting (BSM) was held at Reading Museum and Town Hall over three days from 5 to 7 April 2017, with 44 talks (including six invited talks) and 43 posters presented in eight sessions. The meeting attracted scientists from the UK and abroad to present and discuss seismological research, and to strengthen contacts and establish new links within and between seismological communities and industries. The organizers hope that BSM, with the help of UK universities, will become a regular event. This article provides a brief overview of each session, representing only a snapshot of what was a full and diverse scientific meeting. The rise of induced seismicity Day one started with invited speaker Torsten Dahm (GFZ German Research Centre for Geosciences, Germany) on “The weal and woe of induced seismicity”. Large-scale and long-term operations, such as dams or hydrocarbon production, can induce earthquakes with magnitudes greater than 6. The damage to buildings and infrastructure caused by such earthquakes is an obvious woe of induced seismicity. The weal (at least for seismologists) is that it sets challenges for technical and methodical innovations (e.g. 
improving detection and automatic location methods) as well as providing opportunities to study the physics of earthquake triggering and rupture, such as stress shadow effects and control of maximum magnitude. Miles Wilson (Durham University) gave a global review of anthropogenically induced and naturally triggered earthquakes and presented a database, HiQuake, consisting of more than 700 events thought to be anthropogenically induced (figure 1). In recent years, reports of injection-induced earthquakes have increased. The largest induced earthquake in HiQuake is a Mw 7.9 event triggered by water impoundments in China in 2008; the majority of events in the database are of magnitudes 3–4. 1 View largeDownload slide Proportions of cases reported in the HiQuake database of induced seismicity. (http://www.inducedearthquakes.org/graphs) 1 View largeDownload slide Proportions of cases reported in the HiQuake database of induced seismicity. (http://www.inducedearthquakes.org/graphs) Subsequent speakers presented examples of induced seismicity. Antonio Villaseñor (Institute of Earth Sciences Jaume Almera, ICTJA-CSIC, Spain) reported on induced earthquakes off the coast of Spain where a seismic sequence, the CASTOR crisis, was triggered after the injection of gas into an underground store in September and October 2013. His findings indicate that the three largest events, of about magnitude 4, originated deeper than the injection depth. He and co-workers proposed that the reactivation of an unmapped NW–SE fault in the basement was an explanation. In contrast to the CASTOR crisis, Ian Main (University of Edinburgh) discussed induced seismicity at the Hot Dry Rock test site for geothermal energy production in Cornwall, where the largest event is only Mw 0.5. Only a tiny fraction of the total available strain is released seismically in this highly compliant reservoir, while aseismic slip occurs on optimally oriented faults. 
Staying in the UK, and moving to smaller magnitudes, Anna Horleston (University of Bristol) investigated anthropogenic noise recorded on temporary seismic networks. Prior to any industrial activity in the UK, at least one year of monitoring the natural seismicity level is required, ideally with magnitude of completeness of Ml = 0, which is a challenge to instrumentation and monitoring methods. Two dense temporary networks were deployed in Lancashire and Yorkshire and the seismic noise was analysed to study the effects of site location, burial material and deployment method. She found that the distance between a station and the source of seismic noise has a greater effect on data quality than the deployment method. Testing for efficacy after the installation of monitoring networks is necessary to optimize array design. Moving on to magnitudes, Richard Luckett (British Geological Survey) investigated inconsistencies between station magnitudes measured on near and far stations. As very small earthquakes are detectable only by close stations, the problem arises of how to estimate magnitudes if you cannot trust readings from those stations. To solve this problem, he added an extra, region-dependent, exponential term to the magnitude equation that results in near-source magnitudes agreeing with those at other stations but not changing the vast majority of existing catalogue magnitudes. Investigating unusual signals The final session of day one focused on unusual signals, most of which arose from nuclear explosions. Anton Ziolkowski (University of Edinburgh) explained how he determined the yields and source time functions of announced North Korean (DPRK) nuclear tests. Here, estimating yields is problematic because, unlike for other nuclear test sites, the parameters for the conventional approach relating yields to body-wave magnitude Mb are not known. His method requires two explosions at the same location. 
Then, the Earth impulse response between the two sources and a given seismometer is the same and the source time functions are scaled, with the scaling factor being the cube root of the energy ratio. Computing the spectral ratio of the source time functions and applying the scaling law allows yields to be estimated. Invited speaker Steven Gibbons (NORSAR, Norway) used the nuclear explosions at the DPRK test site as an example of how to formulate the relative location problem as a source-array problem. In his talk, Gibbons focused on accurate seismic event location and showed examples where earthquake location can be challenging, e.g. at the Arctic Mid-Atlantic Ridge (only teleseismic phases but good azimuthal coverage) or the Kashmir earthquake aftershocks (few stations at regional distances and poor azimuthal coverage). His approach to these problems is to consider multiple seismic events together to mitigate the effects of deficiencies in the velocity model. Using probability-based multiple event location algorithms (Bayesloc), good estimates of the uncertainty and bias of travel times along individual paths can be made, even when estimates of the full velocity structure are difficult or impossible. Then, for regions with significant seismicity, knowledge of the velocity model does not need to be perfect in order to improve seismic event locations significantly. The ISC's Kostas Lentas showed how the IASPEI Ground Truth (GT) Reference Event List for monitoring purposes is assembled. Events are divided into categories (GT0, GT1, GT2 and GT5), where the epicentre of a GTx event is known within x km to a 95% confidence level. The best determined GT events (GT0–2) originate from nuclear or chemical explosions, while GT5 events also include earthquakes. At the end of the session we moved on to Mars. 
Jennifer Stevanovic (AWE Blacknest) explained how bolide airbursts will function as a seismic source for the 2018 Mars InSight mission, which will place a single geophysical lander, including a seismometer, on Mars. About 10–200 detectable events per year are expected for InSight. She showed how analogue terrestrial data are used to identify diagnostic airburst characteristics in the time and frequency domains (figure 2).

2 Rare examples of possible surface evidence for airbursts on Mars. (From Stevanović J et al. 2017 Space Science Reviews 1–21)

Passive sources and sensitive signals

Day two of the conference began with invited speaker Eleonore Stutzmann (Institut de Physique du Globe de Paris, France) investigating frequency-dependent microseism sources. Seismic signals with periods of about 3–300 s can be explained by two mechanisms in the ocean: wave–sea-floor interactions at the coast, and wave–wave interactions. Stutzmann analysed secondary microseisms of 3–10 s period, generated by ocean wave–wave interactions and recorded by several networks. Using a beamforming approach, she located the sources of these microseisms and found them to be generated by typhoons, ocean waves reflected at coasts, and icebergs.

Anna Stork (University of Bristol) set out to determine the usefulness of seismic monitoring in the event of a leak from a CO2 storage site. Stork's test site is a dense geophone array at the first commercial carbon capture and storage site in Saskatchewan, Canada. To assess the likelihood of induced seismicity in the case of a CO2 leak, she modelled fluid flow along a vertical fault. She found that the fracture pressure will be exceeded – potentially inducing seismic events – when the CO2 reaches depths of less than 500 m.
Ambient noise interferometry (ANI) and tomographic inversion methods did not detect seismic velocity changes associated with CO2 leakage at the site at this time but, with more data and a larger array, ANI could provide cost-effective, near real-time monitoring.

Invited speaker Tom Mitchell (University College London) kicked off the second session talking about the mysteries of fault behaviour, analysing earthquake fracture damage and healing over the seismic cycle. Damaged rocks develop different wave and rupture propagation patterns and, because of their increased permeability, play a key role in fluid migration in and around fault zones over the seismic cycle. However, little is known about the properties of damage zones and their evolution. Mitchell shared laboratory and field examples of proposed generation mechanisms of pulverized rocks: intensely damaged fault rocks that have undergone minimal shear strain. Crustal velocity reductions at depth after large earthquakes suggest co-seismic fracture damage, while a time-dependent increase in seismic velocities afterwards, lasting from days to years, implies healing processes such as cracks closing, filling with fluids or being sealed by minerals. Little is known about the controls on healing rates, but laboratory experiments hint that fractures heal faster at elevated temperatures.

Moving to a smaller scale, Christopher Harbord (Durham University) investigated frictional instabilities of rough faults by creating faults in the lab. He cuts Westerly granite vertically, polishes the fracture plane to a fine (<1 μm roughness) finish, prepares surfaces with a range of grits, and then performs a series of velocity-step friction experiments in a triaxial deformation apparatus. The pattern of stability transitions suggests that bare rough faults require a stability criterion different from that of other experimental faults, dependent on weak-patch scaling and nucleation length.
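Velocity-step experiments of this kind are conventionally interpreted in the standard rate-and-state friction framework (Dieterich–Ruina), which is background to the topic rather than anything specific to Harbord's talk. A minimal sketch, with purely illustrative parameter values:

```python
import math

# Standard rate-and-state friction quantities used in the stability
# analysis of velocity-step experiments. All numbers below are
# illustrative, not values from any particular study.

def steady_state_friction(mu0, a, b, v, v0):
    """Steady-state friction coefficient after a velocity step v0 -> v:
    mu_ss = mu0 + (a - b) * ln(v / v0).
    If a - b < 0 the fault is velocity-weakening, a prerequisite
    for stick-slip instability."""
    return mu0 + (a - b) * math.log(v / v0)

def critical_stiffness(a, b, sigma_n, d_c):
    """Spring-slider critical stiffness k_c = (b - a) * sigma_n / d_c (Pa/m).
    Slip is potentially unstable when the loading stiffness k < k_c."""
    return (b - a) * sigma_n / d_c

# Illustrative velocity-weakening parameters (a < b):
a_par, b_par = 0.010, 0.015
mu = steady_state_friction(0.6, a_par, b_par, v=1e-5, v0=1e-6)  # tenfold step
kc = critical_stiffness(a_par, b_par, sigma_n=100e6, d_c=10e-6)
# A tenfold velocity increase lowers steady-state friction here,
# and k_c is positive, so instability is possible if k < k_c.
print(mu, kc)
```

Harbord's point is that for bare rough surfaces this simple spring-slider criterion appears insufficient, with weak-patch size entering the stability condition as well.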
Looking at the rupture process in more detail, Jessica Hawthorne (University of Leeds) asked whether standard rupture models and speeds are appropriate for low-frequency earthquakes, such as those that make up tectonic tremor. In an analysis of many small low-frequency earthquakes near Parkfield, California, the rupture extents were two to four times smaller than a standard model would predict for their durations. Low-frequency earthquakes may have different rupture dynamics, or attenuation may be a factor.

How hazardous?

The next session focused on seismic hazard in regions of low to moderate seismicity. Roland Roberts (Uppsala University, Sweden) reported on the Swedish National Seismic Network (SNSN) and why Sweden, despite being in a stable continental area, invests in seismic monitoring. Although the seismic risk in Fennoscandia is low, it is not negligible: Sweden experienced large earthquakes at the end of the last ice age and it is not clear whether such events can occur in an interglacial period. Furthermore, monitoring seismicity is important for the nuclear industry and for assessing hazards such as triggered landslides and damaged dams. A problem in areas with low to moderate seismicity is that standard hazard assessment methods can be misleading because the input data are too sparse and of too low quality; probabilistic seismic hazard assessment (PSHA) could not be successfully applied in Sweden.

Gloria Senfaute and Emmanuel Viallet (both Electricité de France, EDF, France) discussed ways of tackling the shortcomings of PSHA for regions with low to moderate seismicity. These included introducing historical data and Bayesian approaches into PSHA calculations for France to reduce epistemic uncertainties, such as those in the assessment of the maximum earthquake magnitude. The session ended with the ISC's Domenico Di Giacomo giving an overview of the ISC-GEM catalogue for global seismic hazard purposes.
The first release of the ISC-GEM Global Instrumental Earthquake Catalogue covered the period 1900–2009 and aimed to substantially extend, improve and homogenize existing bulletin data for large global earthquakes (magnitude 5.5 and above), providing basic earthquake parameters for users who assess and model seismic hazard and risk. In 2013, a four-year extension project was started to include global earthquakes (magnitude 5.5 and above) that occurred after 2009 and smaller earthquakes (magnitudes between 5.5 and 6.2–6.3) from 1904 to 1959.

Catalogues: power and completeness

Following a busy poster session, Yuzo Ishikawa (AIST, Geological Survey of Japan) started the last session of day two by explaining the difficulties of hypocentre determination in the early 20th century. A major problem was that stations often had different clocks, manually corrected once a day, so that arrival times were not very reliable; nevertheless, the more reliable S–P times from historical data can be used. Another problem was that the magnification of old seismograms was often very low and arrival times were often misread. In the discussion after the talk, Robin Adams gave two examples of earthquakes that were grossly mislocated because: (1) surface waves were picked instead of body-wave phases, and (2) P′2 phases from a source in the Pacific were misinterpreted as P phases, erroneously creating an M5 event in Cameroon!

Marleine Brax (National Council for Scientific Research, Lebanon) presented an initial version of a seismic catalogue for seismic hazard assessment in Lebanon, built from historical and modern instrumental data. The historical data, covering about 2000 years, had to be reviewed carefully: some events turned out to be two events several years apart, because of different timing methods in the underlying studies, and some studies mixed up city names and reported earthquakes that were actually located in different areas.
Furthermore, magnitudes from the recent instrumental catalogue and the historical catalogue were found to be inconsistent. Given these factors, the seismic record might not be representative of the seismic potential in Lebanon; PSHA might therefore underestimate it, and fault modelling might be a more effective approach for estimating seismic hazard.

Pierre Arroucau (Dublin Institute for Advanced Studies) demonstrated that Ireland's seismicity is not as elusive as typically thought. It is characterized by a few low-magnitude events, as reflected in the seismic hazard map for Europe (figure 3). Arroucau proudly presented 218 new events from 2010–16 that were detected using waveform cross-correlation. Because this method is limited to repeating events, there might be even more seismicity in Ireland waiting to be detected.

3 The European Seismic Hazard Map displays the ground shaking (i.e. peak horizontal ground acceleration) to be reached or exceeded with a 10% probability in 50 years. (© 2013 ETH Zurich on behalf of EU-FP7 Consortium of SHARE. D Giardini, J Woessner, L Danciu, H Crowley, F Cotton, G Grünthal, R Pinho, G Valensise and SHARE consortium)

To finish the second day, Dmitry Storchak (ISC) gave an overview of ISC datasets and services. The ISC's main mission is to provide the definitive parametric information on recent and past earthquakes and other seismic events. The ISC collects and integrates seismic bulletin data from about 140 networks worldwide, making the ISC Bulletin the most complete global long-term source of seismic bulletin information.
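The "10% probability in 50 years" exceedance level quoted for the European hazard map corresponds to a fixed return period. Assuming the Poisson model of earthquake occurrence conventionally used in PSHA, it works out to roughly 475 years:

```python
import math

# Return period implied by an exceedance probability over a time window,
# under the Poisson occurrence model used in conventional PSHA:
#   p_exceed = 1 - exp(-window / T)   =>   T = -window / ln(1 - p_exceed)

def return_period(p_exceed: float, window_years: float) -> float:
    """Mean return period (years) for ground shaking exceeded with
    probability p_exceed in a window of window_years."""
    return -window_years / math.log(1.0 - p_exceed)

print(round(return_period(0.10, 50.0)))  # -> 475
```

This is why maps like the one in figure 3 are often described interchangeably as "10% in 50 years" or "475-year return period" maps.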
Imaging the Earth at all scales

The first session of the final day saw images of the Earth at all scales, from crust to core. Keith Priestley (University of Cambridge) used receiver functions and high-frequency surface-wave analysis to investigate the lithospheric structure across the Indo-Eurasian belt. He found that radioactive heating of the Tibetan crust resulted in a crustal temperature inversion and lowered the velocities of the upper mantle beneath northern Tibet.

Although Mt Paektu/Changbaishan was responsible for one of the largest volcanic eruptions in history, little is known about it because it is located on the border between the DPRK and China (figure 4), making it difficult to carry out comprehensive studies. James Hammond (Birkbeck, University of London) was one of the first western scientists invited to the DPRK to study the volcano, which had shown recent signs of unrest (deformation, increases in seismicity and volcanic gases); he deployed the first broadband seismometers in the DPRK. Until then, data had been collected on the Chinese side only. Receiver function analysis from the DPRK side reveals high Vp/Vs ratios, indicating the presence of significant melt beneath the volcano. Future work will involve combining the Chinese and DPRK data sets.

4 Mt Paektu/Changbaishan at the border of North Korea and China. (Google Maps)

Invited speaker Andy Nowacki (University of Leeds) investigated the two large low-shear-velocity provinces (LLSVPs) at the core–mantle boundary (CMB). These provinces show reductions in shear-wave velocity of ∼3% and are located antipodally beneath the Pacific Ocean and continental Africa. Little is known of their origin (thermal or chemical?) and structure, because tomographic studies cannot resolve them sufficiently.
To get more information on their structure, Nowacki studied deviations of P and Pdiff phases along travel paths from Pacific earthquakes to the Yellowknife Array in Canada, which provide good coverage of the edge of the Pacific LLSVP. The distributions of backazimuthal deviations and turning-point depths suggest a pipe-like structure of low P-wave velocity, about 200 km in diameter and rising at least 300 km above the CMB northwest of Hawaii. Such a structure might be connected with the Hawaiian hotspot.

Dealing with big data

Invited speaker Tarje Nissen-Meyer (University of Oxford) reflected upon big data in the context of Occam's razor: a principle of parsimony stating that a simpler, less complicated solution should be preferred over a more complicated one if they lead to similar results. It is a challenge to manage the computational costs (including energy costs) of calculating wave propagation through complex 3D crustal structure, for example, while maintaining sufficiently accurate wave physics. The choices, compromises and approximations necessary to make computations feasible have to be considered carefully: does the output justify a complex model, or can similar results be achieved by simpler models at lower computing cost? Finding the proper balance is challenging.

Kuangdai Leng (University of Oxford) presented results from the modelling scheme AxiSEM3D, which tries to address some of these issues. It is a spectral-element method, developed by Nissen-Meyer and others, that computes 3D global seismic wavefields for realistic earthquake sources using axisymmetric background models, extended here to implement the crust in 3D models. This is computationally challenging because the crust's thickness varies drastically between oceanic and continental regions. The results of AxiSEM3D are comparable to (or better than) those from a full SEM, but obtained up to twice as fast.
Natalia Poiata (National Institute for Earth Physics, Romania) presented BackTrackBB, a fully automated detection and location method that aims to deal with increasing volumes of seismological data. It is an array-based automatic method that back-tracks the broadband signal to its origin, is applicable to dense networks and continuous data and, importantly, does not require preliminary information about seismic sources. After a characteristic function of the signal is calculated, earthquake detection and location are accomplished by back-projecting station-pair time-delay estimates according to the theoretical time delays across the network. The method was successfully applied to low-frequency earthquakes in Japan and a seismic swarm in Romania, and could detect events hidden in seismic coda or noise, or with emergent onsets.

Earthquake source parameters, such as depth and the source time function (STF), are necessary for global waveform modelling, but determining STFs manually is time-consuming. Karin Sigloch (University of Oxford) presented a fully probabilistic seismic source inversion that makes the determination of source parameters more efficient by decomposing 900 robust, manually deconvolved STFs into empirical orthogonal functions (EOFs). Approximating the posterior distribution of the STF by weighting the first 4–8 EOFs yields an ensemble of solutions that fit the observed waveforms well, allowing conclusions to be drawn about the source time function, the source mechanism and the source depth. Rather than choosing one best solution, the uncertainties of the STFs are propagated into the travel-time delays used in tomography.

The ISC's Jennifer Weston finished the last session of the meeting by explaining how the EHB catalogue is being rebuilt and extended to create the ISC-EHB catalogue. The EHB catalogue consists of teleseismically well-constrained events selected from the ISC Bulletin, which are widely used for tomographic inversions.
The EHB catalogue ends in 2008; the aim of the ISC-EHB is to extend it as well as to recreate it with new, more rigorous procedures for event selection, data preparation, processing and relocation.

Closing remarks

Alongside the busy speaker schedule, an active and diverse poster session provided the opportunity for many discussions. Poster presentations from early-career researchers were judged, and two prizes sponsored by AWE (who also sponsored eight student conference grants) were awarded during the conference dinner: to Emily Crowder (University of Aberdeen) for her presentation on using dispersion analysis of ambient seismic noise for hydrocarbon exploration, and to Jennifer Jenkins (University of Cambridge) for her presentation on constraining Icelandic crustal structure using receiver function analysis.

Overall, there was great support from the seismology community and feedback showed clear demand for another event. At the end of the conference, Anton Ziolkowski offered the University of Edinburgh as a potential venue for BSM 2019 – we hope to see you there!

ACKNOWLEDGMENTS

We would like to thank the conference sponsors for their support: the British Geophysical Association (BGA), AWE, Güralp Systems Ltd and Optics11. We are also grateful to the staff of Reading Town Hall and Museum, Jane Austen and Keith Andrews, and to Sheila Peacock of AWE for her help and advice in organizing and planning this meeting. The organizer of the meeting, the ISC, is supported by 65 member institutions in 49 countries, including the National Science Foundation (Award 1417970), and by several sponsors from both public and commercial sectors.

FURTHER INFORMATION

The BSM 2017 website holds talks and abstracts: http://bsm2017.isc.ac.uk

© 2018 Royal Astronomical Society

Journal: Astronomy & Geophysics (Oxford University Press). Published: 1 February 2018.

