Statewide mineral potential mapping of New South Wales using a combined mineral systems and spatial data approach

Blevin, Phillip1; Downes, Peter1; Fitzherbert, Joel1; Ford, Arianne2; Peters, Katie2; Greenfield, John1

1Geological Survey of New South Wales, Department of Regional NSW, Maitland, Australia, 2Kenex Ltd, Wellington, New Zealand

The Geological Survey of New South Wales (GSNSW) has completed a four-year mineral potential mapping project across the major metallogenic provinces of NSW. These include the New England, eastern and central Lachlan and Delamerian orogens as well as the Curnamona Province. Mineral system models, specific to each province/event, were prepared for orogenic Au and Au–Sb, polymetallic volcanic-associated massive sulfide, Broken Hill Pb–Zn–Ag, Cobar Pb–Zn and Cu–Au, Macquarie Arc porphyry Cu–Au and intrusion-related Au and Sn–W systems. Skarn potential modelling of the eastern Lachlan Orogen was also undertaken, testing the intersection of granite fertility with structural and reactive-rock data to model fluid flow and traps using essentially 2D data.

In excess of 10 million drillhole assays, 152,000 attributed faults, 31,190 mineral occurrences, 197,754 field observations and 883,967 surface geochemical assays were used, in addition to the NSW Seamless Geology and statewide geophysical and metamorphic datasets.

Where possible, a weights-of-evidence approach was used. Typically, over 100 valid predictive maps were generated for each mineral system to model source, transport, trap and depositional characteristics. Spatial relationships were tested with between 8 and 28 training points, and between 8 and 18 predictive maps were selected for each final model. The efficiency of classification for most models was better than 95%, with prospective areas covering 3% to 16% of the relevant province and highly prospective areas being significantly smaller.
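For a single binary predictive map, the weights-of-evidence calculation reduces to simple counts of unit cells and training points on and off the pattern. A minimal sketch in Python (all counts are hypothetical, for illustration only):

```python
import math

def weights_of_evidence(n_cells, n_deposits, n_pattern, n_on_pattern):
    """W+, W- and contrast for one binary predictive map.

    n_cells: unit cells in the study area
    n_deposits: training points (known deposits)
    n_pattern: cells where the predictive pattern is present
    n_on_pattern: training points falling on the pattern
    """
    p_b_d = n_on_pattern / n_deposits                             # P(pattern | deposit)
    p_b_nd = (n_pattern - n_on_pattern) / (n_cells - n_deposits)  # P(pattern | no deposit)
    w_plus = math.log(p_b_d / p_b_nd)
    w_minus = math.log((1 - p_b_d) / (1 - p_b_nd))
    return w_plus, w_minus, w_plus - w_minus  # contrast = W+ - W-

# e.g. 10 of 20 training points fall on a pattern covering 5,000 of 100,000 cells
w_plus, w_minus, contrast = weights_of_evidence(100_000, 20, 5_000, 10)
```

A positive W+ with a strongly positive contrast indicates that the pattern is spatially associated with the training points and is a candidate for inclusion in the final model.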

Importantly, measures of data confidence were captured, and all polygons have attached metadata indicating the predictive layers used in their construction. There was interactive feedback at all stages between the Kenex spatial analysts and GSNSW mineral system specialists. The final reports, primary datasets, spatial data tables and data, including thousands of intrinsically useful predictive maps, are freely available for download from DIGS, with key spatial layers also accessible on the MinView platform. The prepared data outputs provide an ideal opportunity for testing revised and new mineral system models, or to add new data and explore alternate methodologies.

In areas under cover, geophysical mapping of secondary structures and intrusions was key in compensating for lower data density. The association of many mineral systems with gravity and magnetic worms at various depths is also intriguing, although the reason for this association is not yet clear. The correlation and understanding of these linkages through the crust will be a focus for future mineral potential mapping studies in areas devoid of relevant surface data.

A key learning from the project is the need to ensure that all datasets are in usable formats and fit for purpose prior to spatial modelling. This includes the standardisation of geological drill logs, stratigraphic unit descriptions, and attributes in mineral occurrence and petrological databases. In the future, rock reactivity and permeability values, petrophysical data and magmatic fertility parameters, combined with seamless geology and fully attributed fault and metamorphic layers, will allow the construction of on-demand mineral potential maps viewable on MinView, and will permit data export into machine learning packages for real-time modelling on online platforms.


Phil Blevin (BSc – UNE, PhD – JCUNQ) specialises in understanding controls on the fertility of felsic magmas as well as related Sn(-W), W-Mo-Bi, granite-related Au and porphyry Cu-Au mineral systems. He currently leads the mineral system (MinSys) team at GSNSW, including HyLogger operations.

Extending FaultSeg to Minerals Seismic: Part 1 – A model to generate synthetic seismic volumes modelling faults via elastic dislocation theory while incorporating multifractal features

Robindra Chatterjee1, Dion Weatherley2, Geoff McLachlan3 and Rick Valenta1,2

1The University of Queensland, Sustainable Minerals Institute, W.H.Bryan Mining and Geology Research Centre, Brisbane, Australia, 2The University of Queensland, Sustainable Minerals Institute, Julius Kruttschnitt Mineral Research Centre, Brisbane, Australia, 3The University of Queensland, School of Mathematics and Physics, Brisbane, Australia

Automatic seismic interpretation of 3D volumes used in minerals exploration is an under-researched topic. We are conducting a three-part investigation to extend the state of the art in automatic fault interpretation methods based on the FaultSeg3D algorithm developed by Wu et al. (2019, 2020). This approach has demonstrated superior performance in automatic fault and horizon picking on: (i) synthetic seismic volumes created to emulate the geological complexity, typical of sedimentary terranes, relevant to petroleum targeting; and (ii) publicly available field surveys acquired as part of petroleum exploration programs. Our hypothesis is that training FaultSeg3D on this style of data leads to poor prediction of fault locations in 3D seismic datasets surveyed over hardrock terranes, which typically exhibit greater geological complexity. This is a model generalisation issue, which occurs when the joint distribution of the data in the synthetic volumes used to train a machine-learning-based fault-prediction algorithm differs significantly from that in the field survey used for prediction. A 3D seismic volume from an operating gold mine in Queensland was provided for this study courtesy of Evolution Mining Ltd. (ASX: EVN), along with the 3D geological model and drilling database to validate results.

This research has developed a flexible synthetic-seismic-volume generator that will be used to create more representative training data for a machine-learning-based fault-prediction algorithm. Geological folding, faulting, post-process filtering and acquisition noise are modelled via successive image-processing convolutions. The first feature of the model is the ability to accommodate a user-specified degree of geological complexity, by adjusting the level of geological folding and faulting to match the target field survey's terrane, for any size of cube. The second is a more physically realistic treatment of the three major kinds of faults, based on elastic dislocation theory. The third is the addition of multifractal fault and vein population characteristics that further impart geological realism. In the next phase of research we will train a deep-learning algorithm on several hundred synthetic seismic volumes that emulate the geological complexity at the gold mine, to produce an automatic fault-surface prediction tool that could aid the targeting of fault-constrained mineralisation.
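The successive-convolution idea can be illustrated with a toy generator. This sketch uses a vertical fault with constant throw and a generic Ricker wavelet, and omits the elastic-dislocation displacement fields and multifractal populations described above:

```python
import numpy as np

rng = np.random.default_rng(0)
n_t, n_x = 128, 64  # time samples x traces

# 1. Layered "geology": a handful of horizons with random reflectivity
reflectivity = np.zeros((n_t, n_x))
for t in np.sort(rng.choice(np.arange(10, n_t - 10), size=12, replace=False)):
    reflectivity[t, :] = rng.uniform(-1, 1)

# 2. Gentle folding: smooth vertical shift of each trace
fold = np.rint(4 * np.sin(np.linspace(0, np.pi, n_x))).astype(int)
for x in range(n_x):
    reflectivity[:, x] = np.roll(reflectivity[:, x], fold[x])

# 3. A vertical fault at trace 32 with constant throw (a dislocation-based
#    generator would instead displace along a dipping plane)
reflectivity[:, 32:] = np.roll(reflectivity[:, 32:], 6, axis=0)

# 4. Band-limit with a Ricker wavelet and add acquisition noise
t = np.arange(-15, 16)
a = (np.pi * 0.08 * t) ** 2
ricker = (1 - 2 * a) * np.exp(-a)
seismic = np.apply_along_axis(lambda tr: np.convolve(tr, ricker, mode="same"),
                              0, reflectivity)
seismic += 0.02 * rng.standard_normal(seismic.shape)
```

Each step is an image-space operation on the reflectivity grid, which is what makes the pipeline easy to randomise when generating hundreds of training volumes.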


Robindra is completing the second year of his PhD at The University of Queensland. His academic interests are in applied multivariate statistics, machine learning and 3D seismic.

Propagating sparse basement markers through inversion volumes using graph convolutional neural networks

Gillfeather-Clark, Tasman1; Horrocks, Tom1; Wedge, Daniel1; Holden, Eun-Jung1

1Centre for Exploration Targeting, School of Earth Sciences, University of Western Australia, Australia

A common problem in geoscience is spatially extrapolating sparse data, such as drill core logs or geochemical assays, in the presence of larger encompassing datasets such as inversion volumes. For example, a seismically derived basement surface may be refined using a collocated physical model in which basement and regolith are in high contrast, such as an airborne electromagnetic (AEM) inversion volume. What makes this integration challenging for an interpreter or a machine learning approach is relating the different datasets in a way that is both robust and scalable. Co-kriging is a scalable approach to this problem, but it requires the variables to be consistently correlated, whereas graphs can accommodate changing relationships. We propose using graphs to model the relationships between multiple datasets in 3D space.

A graph is a data structure comprising entities (nodes) and the connections between them (edges). Nodes can have attributes such as values and labels, while edges can be directional or bidirectional, and weighted or unweighted. Graphs are underutilised in the geosciences due to a lack of inherently graph-structured data and the relatively recent development of graph neural networks. Our work shows the potential for graphs and graph neural networks to be applied to geoscience problems, and presents a general workflow for implementing a graph structure on 3D geoscience datasets.

We focus on basement delineation over a detrital iron deposit of the Fortescue Valley in the Pilbara. Detrital iron deposits are accumulations of detritus eroded from a primary deposit (Koodaideri Iron) and trapped in basement depressions; the paleo-basement surface is thus crucial to the formation of the deposits. We use two key datasets: an AEM laterally constrained inversion (LCI) volume and a basement surface, the latter produced through interpretation of AEM inversion data, drilling and seismic data.

Graph Neural Networks (GNNs) are powerful neural network architectures designed for graph data. The central paradigm of GNNs is message passing, in which predictions are made by ‘passing’ the embedded state of each node to its neighbours. This means that classification is controlled by the edges we define, rather than by the attributes of the nodes. Further, this allows us to connect nodes from different datasets that share no common attributes but occupy the same space.
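A single message-passing step can be sketched with plain matrix operations. This toy example uses a hypothetical 4-node graph with scalar embeddings and a fixed self/neighbour mixing; a real GNN would interleave learned weight matrices and nonlinearities:

```python
import numpy as np

# Toy graph: 4 nodes, symmetric adjacency (1 = edge)
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = np.array([[1.0], [0.0], [0.5], [0.2]])  # node embeddings (one scalar each)

# One message-passing step: each node mean-aggregates its neighbours'
# embeddings, then mixes the result with its own state
deg = A.sum(axis=1, keepdims=True)
messages = (A @ H) / deg           # mean of neighbour states
H_next = 0.5 * H + 0.5 * messages  # simple self/neighbour mixing
```

Because the update is driven entirely by A, changing which nodes are connected (the edges we define) changes the prediction, even if node attributes stay fixed.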

The nodes of our network come from the AEM LCI point cloud, where conductivity and uncertainty values sit at discrete points in 3D space. We use the basement surface to label nodes as either cover (above) or basement (below). The edges of the graph are generated from an adjacency matrix calculated by thresholding a cosine-similarity matrix on the z-score of the similarity values. Cosine similarity is a measure for comparing any two vectors; here the vectors are built from relationships we consider important within our data, such as spatial relationships (X, Y, Z) or conductivity–depth relationships (COND, Z). Any number of these similarity matrices can be combined and weighted using the dot product.
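The edge-construction step might be sketched as follows (random points stand in for the AEM LCI point cloud, and the (X, Y, Z) vectors are one of the relationship choices mentioned above):

```python
import numpy as np

rng = np.random.default_rng(42)
pts = rng.uniform(0, 100, size=(50, 3))  # stand-in (x, y, z) inversion points

# Cosine similarity between every pair of node vectors
unit = pts / np.linalg.norm(pts, axis=1, keepdims=True)
S = unit @ unit.T

# Threshold on the z-score of the off-diagonal similarities to form edges
off_diag = S[~np.eye(len(S), dtype=bool)]
z = (S - off_diag.mean()) / off_diag.std()
A = (z > 1.0) & ~np.eye(len(S), dtype=bool)  # adjacency matrix, no self-loops
```

Raising or lowering the z-score threshold directly controls graph density, i.e. how many neighbours influence each node during message passing.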

Our preliminary results predicted whether an inversion point was basement or cover with 93.5% accuracy on a 4,200-point line using only 200 labelled points. The misclassified points were mainly along the basement interface, where label quality is most uncertain owing to conflicting interpretations. This represents a preliminary examination of our research problem; future work includes increasing the number of survey lines in the graph, different labelling regimes, the integration of additional datasets (drilling), and the propagation of more challenging labels (lithology).


Tasman is a PhD candidate within the Centre for Exploration Targeting at UWA. He is studying the integration of machine learning and geophysics under Professor Eun-Jung Holden, Dr Tom Horrocks and Dr Daniel Wedge.

Are giant ore deposits rogue waves or dragon kings?

Hobbs, Bruce1; Ord, Alison2

1CSIRO, Perth Australia, 2The University of Western Australia, Perth, Australia

We address the question: does the formation of giant hydrothermal ore deposits involve different processes to the formation of average-sized ore deposits? We draw analogies between the processes involved in mineralising systems and those used to explain the formation of rogue waves in oceanography and nonlinear optics, and of tropical cyclones. Giant ore bodies constitute the tails of “fat-tailed” distributions, or lie off the end of such distributions. It is important to distinguish between deposits that belong to a self-similar system (referred to as black swans by Sornette) and those that exceed what is expected from such a distribution (Sornette’s dragon-kings). It is proposed that similar processes operate for all members of a black swan distribution, and that different processes, operating via a phase transition, produce those that differ from the rest of the distribution. The nonlinear coupling between chemical reactions, deformation and permeability generation is discussed as a mechanism for forming regions of the Earth’s crust with anomalously high permeability. This nonlinear coupling is identical in its physics to that used to explain rogue waves, tropical cyclones and dragon-kings. Thermodynamic arguments suggest that the probability distributions for ore bodies at regional and microstructural scales should be fat-tailed Fréchet distributions, and examples from natural mineralising systems are shown to conform to these distributions across a range of spatial scales. The implications for mineral deposit prediction and assessment using log-normal distributions and Zipf’s law are discussed, together with the implications for kriging.
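The fat-tail property of the Fréchet distribution can be illustrated numerically. For the standard Fréchet CDF F(x) = exp(−x^−α), the survival probability P(X > x) decays as the power law x^−α rather than exponentially, so very large events remain far more probable than a log-normal model would suggest. A quick sketch using inverse-CDF sampling:

```python
import numpy as np

rng = np.random.default_rng(1)
alpha = 1.5  # Fréchet shape parameter; smaller alpha means a fatter tail

# Inverse-CDF sampling: F(x) = exp(-x**(-alpha))  =>  x = (-ln U)**(-1/alpha)
u = rng.uniform(size=500_000)
samples = (-np.log(u)) ** (-1 / alpha)

# Power-law tail check: P(X > x) * x**alpha approaches a constant (~1)
for x in (10.0, 30.0, 100.0):
    survival = (samples > x).mean()
    print(f"x={x:6.1f}  P(X>x)={survival:.5f}  scaled={survival * x**alpha:.2f}")
```

The stabilising "scaled" column is the signature of a fat (power-law) tail; for a log-normal distribution the same quantity would collapse towards zero as x grows.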


Bruce Hobbs is a structural geologist. His present interests are in applying the tools developed for nonlinear dynamical systems over the past 50 years or so to large data sets on alteration assemblages, deformation and mineralisation in mineralising systems in order to extract information of relevance to metal discovery.

Random forest based mineral potential mapping for porphyry Cu-Au mineralisation in the eastern Lachlan Orogen

Ford, Arianne1

1Kenex Ltd., Lower Hutt, New Zealand

Random forests represent a machine learning implementation of a decision-tree algorithm that can be applied to data-driven mineral potential mapping. Most published studies using random forests include relatively small numbers of input maps that are typically pre-classified by an expert familiar with the mineral system being targeted. The aim of this study was to investigate how random forests performed using different input parameters in terms of the individual predictive maps and training data. Four different implementations of the random forest algorithm were produced based on a case study using data from the eastern Lachlan Orogen in NSW for the purposes of targeting porphyry Cu-Au mineralisation related to the Macquarie Arc: (1) using a large number of multi-class categorical or non-thresholded predictive maps that have had no favourability criteria applied; (2) using a large number of binary predictive maps that have had statistically valid and geologically meaningful thresholds determined through weights of evidence analysis and expert review; (3) using a subset of the binary predictive maps that were used in a weights of evidence mineral potential mapping study; and (4) using this same subset of binary predictive maps with weighted training data. These results were then compared to the results of an existing weights of evidence mineral potential mapping study.
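The general cell-based workflow of implementations (2) and (3) might be sketched as follows (entirely synthetic binary predictive maps and scikit-learn standing in for the project's actual datasets and software):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(7)

# Hypothetical inputs: each row is a unit cell, each column a binary
# predictive map (1 = favourable) with thresholds already applied
n_cells = 5000
X = rng.integers(0, 2, size=(n_cells, 8))
# Synthetic "deposit" labels correlated with the first three maps only
score = X[:, :3].sum(axis=1) + rng.normal(0, 0.5, n_cells)
y = (score > 2.5).astype(int)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
prob = rf.predict_proba(X)[:, 1]                   # favourability per cell
prospective = (prob > 0.5).mean() * 100            # % of area flagged prospective
ranking = rf.feature_importances_.argsort()[::-1]  # input-map ranking
```

Weighted training data, as in implementation (4), would be supplied through the sample_weight argument of fit; the feature_importances_ ranking is the quantity whose sensitivity to expert intervention is examined below.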

The results of the random forest analysis demonstrate that both the ranking of the input maps and the resulting mineral potential vary considerably depending on the degree of expert intervention in the modelling process. The first approach produced a prospective area that covered 47.7% of the study area, the second approach 6.5%, the third approach 23.4%, and the final approach with the weighted training data 40.4%. In comparison, the weights of evidence study produced a prospective area that covered 15.2% of the study area, but failed to predict one of the training points within this prospective area. Increasing the complexity of the input data improved the predictive capacity of the mineral potential maps for targeting porphyry Cu-Au mineralisation when expert review was used to determine meaningful thresholds and classifications for the input predictive maps. However, when a large number of multi-class categorical or non-thresholded predictive maps were used as input to the random forest (i.e. no favourability criteria were applied, so the algorithm determined the thresholds rather than an expert), a poor result was obtained. The results also highlight that the main limitation of using random forests (and other machine learning approaches) for mineral potential mapping is the lack of a sufficient number of economically significant deposits available to train models that use a large number of input predictive maps.

The random forest study clearly demonstrates that the use of predictive maps with statistically valid, geologically meaningful and practically useful thresholds and reclassifications produces more robust mineral potential maps that can be used for exploration targeting.


Arianne is a spatial data analyst whose focus is on the use of mineral potential mapping and spatial statistics for mineral exploration. She spent more than 10 years as an academic before moving to industry to work on delivering value-added geoscience data to both government organisations and the exploration industry.

About the GSA

The Geological Society of Australia was established as a non-profit organisation in 1952 to promote, advance and support Earth sciences in Australia.

As a broadly based professional society that aims to represent all Earth Science disciplines, the GSA attracts a wide diversity of members working in a similarly broad range of industries.