The SABLE® Core Module supports the definition and management of a Data Standard. This standard integrates the database structure, data capture, data and database validation, multi-user security, graphical reporting and powerful data management functionality. It also provides tools to track and audit data integrity and the database structure against the Data Standard.
The SABLE® methodology enforces transparency and auditability throughout all interfaces and functionality. These principles are supported throughout all the other modules. Scaled, graphical reporting is supplied with the Core Module for the specialised translation and evaluation of qualitative geoscientific data sets into quantitative symbols, colours, patterns, labels and charts, which promote verification, interpretation and understanding of this complex, qualitative data.
The SABLE® Audit Manager module provides transparent tracking of each activity performed by a user through the Core Module's data capture and data management functions. Each change to the Data Standard or security rights is recorded against user, date, time and session, unobtrusively and efficiently. The audit log analyser function allows the super user to interrogate the audit log from various angles to get a full picture of changes made over time.
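The principle described above — every change recorded against user, date, time and session, then interrogated from various angles — can be sketched as follows. This is a generic illustration, not SABLE®'s actual schema; all field names and values are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class AuditEntry:
    """One recorded change: who, when, in which session, and what changed."""
    user: str
    timestamp: datetime
    session: str
    table: str
    field: str
    old_value: str
    new_value: str

def entries_for_user(log: List[AuditEntry], user: str) -> List[AuditEntry]:
    """Interrogate the log from one angle: all changes made by a given user."""
    return [e for e in log if e.user == user]

# Hypothetical log entries
log = [
    AuditEntry("amara", datetime(2024, 3, 1, 9, 15), "S-101",
               "Lithology", "RockType", "SST", "QTZ"),
    AuditEntry("jonas", datetime(2024, 3, 1, 10, 2), "S-102",
               "Collar", "Elevation", "1520.0", "1520.5"),
]
```

Filtering by date range, session or table follows the same pattern, building up the "full picture of changes made over time".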
Extract, Transform and Load Module
ETL, or Extract, Transform and Load, is international best practice for importing external data into a multi-user data warehouse environment with the required quality assurance (QA). The ETL principles are fundamental to SABLE®'s approach to handling routine external data such as lab analyses, downhole surveys and geophysics.
The first stage is to ‘Extract’ the external data from a third-party data format such as an ASCII file or database table.
The second stage is ‘Transform’, when the extracted data is translated, evaluated, formatted, mapped and migrated into clearing tables in the database. Once in the clearing tables, the third-party data can be viewed and validated using SABLE® Data Warehouse tools. In addition to the validations, which are configured per data set, computed/virtual fields are defined to query the destination tables that will receive the data ahead of the load. In this way the data is screened through a firewall before the final ‘Load’ stage.
Once each submission / batch of data has been cleared, it is loaded automatically into the correct position in the final data structure.
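The three stages can be illustrated with a minimal sketch. The `Au_ppm` range check below is a hypothetical validation rule standing in for the per-data-set validations configured in the clearing tables; the field names are illustrative only.

```python
# Assumed validation rule for the clearing table (hypothetical analyte and range)
DESTINATION_RANGES = {"Au_ppm": (0.0, 1000.0)}

def transform(raw_rows):
    """Map extracted rows into clearing-table records, screening out failures."""
    cleared, rejected = [], []
    for row in raw_rows:
        try:
            value = float(row["Au_ppm"])
        except (KeyError, ValueError):
            rejected.append((row, "missing or non-numeric Au_ppm"))
            continue
        lo, hi = DESTINATION_RANGES["Au_ppm"]
        if not (lo <= value <= hi):
            rejected.append((row, "value out of expected range"))
            continue
        cleared.append({"sample_id": row["sample_id"], "Au_ppm": value})
    return cleared, rejected

def load(cleared, destination):
    """Load only the validated batch into the final data structure."""
    destination.extend(cleared)

# 'Extract' stage stand-in: rows read from a third-party file
raw = [{"sample_id": "S1", "Au_ppm": "0.35"},
       {"sample_id": "S2", "Au_ppm": "n/a"}]
destination = []
cleared, rejected = transform(raw)
load(cleared, destination)
```

Only rows that pass the firewall reach `destination`; the rejects stay in the clearing stage with a reason attached, ready for review.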
Statistical Analysis Toolkit
The Statistical Analysis Toolkit module is used for Quality Control reporting.
Required statistical populations are defined as data sets by filtering and mapping them to the database structure. These data sets include control samples for field and laboratory batches such as certified standards, blanks and duplicate samples.
Classical statistical parameters are produced in order to verify normal distributions. Charts include histograms, scatter plots, regression lines, control samples within expected certified ranges, variance per standard within a single batch, and differences between original and duplicate sets relative to the mean.
Tests include the F-test, to compare the variances of two populations, and the t-test, to compare the means of sample populations.
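Both statistics can be computed directly from sample variances and means. The sketch below is a generic textbook formulation, not SABLE®'s implementation, and the original/duplicate values are invented for illustration.

```python
from math import sqrt
from statistics import mean, variance

def f_statistic(a, b):
    """F-test statistic: ratio of sample variances (larger over smaller)."""
    va, vb = variance(a), variance(b)
    return max(va, vb) / min(va, vb)

def t_statistic(a, b):
    """Two-sample t statistic (pooled, equal-variance form) comparing means."""
    na, nb = len(a), len(b)
    pooled = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / sqrt(pooled * (1 / na + 1 / nb))

# Hypothetical original vs duplicate assay pairs from one batch
originals = [2.10, 2.30, 2.25, 2.40]
duplicates = [2.05, 2.35, 2.20, 2.45]
```

The resulting statistics would then be compared against critical values for the chosen confidence level to flag a non-compliant batch.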
Non-compliant batches can be reported. Graphs, text and tables are grouped to compile reports. Reports are generated by user and date, ready for auditors, annual reports, resource and reserve estimation, and production QA/QC by batch.
This module can be run on analytical data at any point in the drillhole or project life cycle.
It is recommended that this QC functionality be run routinely on each analyte and lab sample batch as it passes through the ETL clearing tables as part of the QA firewall. Batches should only be released into the primary database structure if QC is accepted. This can be defined as one of the sign-offs in the QM Module.
Sample Tracking Module
The Sample Tracking module provides the software implementation of field sample management for physical sample tracking and reporting.
Sample positions (depths) are planned and captured in the database. Unique sample ticket identifiers, laboratory request sheets and sample dispatch forms are generated off the database in order to prevent transcription errors.
Batches of samples are moved and tracked through the sampling and dispatch procedure.
Unique sample identifiers as well as barcode printing are supported.
The system is able to issue lab analysis request forms, flag the current state of each sample and interface to the ETLQA Module to manage lab receipts.
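Flagging the current state of each sample as it moves through the sampling and dispatch procedure amounts to a small state machine. The sketch below is hypothetical; the state names and ticket format are illustrative, not SABLE®'s.

```python
# Assumed linear workflow: each state has exactly one permitted successor
ALLOWED_TRANSITIONS = {
    "planned": "ticketed",
    "ticketed": "dispatched",
    "dispatched": "received_at_lab",
    "received_at_lab": "analysed",
}

class Sample:
    """A physical sample tracked by its unique ticket identifier."""

    def __init__(self, ticket_id):
        self.ticket_id = ticket_id
        self.state = "planned"

    def advance(self, new_state):
        """Move the sample forward; reject out-of-sequence transitions."""
        if ALLOWED_TRANSITIONS.get(self.state) != new_state:
            raise ValueError(
                f"cannot move {self.ticket_id} from {self.state} to {new_state}")
        self.state = new_state

s = Sample("TKT-000123")
s.advance("ticketed")
s.advance("dispatched")
```

Because every transition is checked, a sample can never be marked as received at the lab before it was dispatched, and the current flag is always meaningful.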
Quality Management Module
The Quality Management module supports a user sign off process within the database itself.
It consists of sets of fields which are restricted to particular users within their own active data branches (e.g. boreholes / working places) for personal accountability.
A work procedure is defined for each data set. This is related to sign-off points, which are allocated by skill level so that competent users authorise juniors' work at strategic points.
Finally, the most senior manager's sign-off triggers a status change on the data set within the database from active to signed off, making it available for publication and reporting. Tools are provided to manage this sign-off process easily.
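The skills-based sign-off chain can be sketched as follows. The skill levels and the rule that each sign-off must come from a more senior level are assumptions for illustration; the active-to-signed-off status change mirrors the behaviour described above.

```python
# Hypothetical skill hierarchy (higher number = more senior)
SKILL_LEVELS = {"junior": 1, "geologist": 2, "senior_manager": 3}

class DataSet:
    """A data branch (e.g. a borehole) moving from active to signed off."""

    def __init__(self, name):
        self.name = name
        self.status = "active"
        self.signoffs = []

    def sign_off(self, user, level):
        """Record a sign-off; assume each must come from a more senior level."""
        highest = max((SKILL_LEVELS[l] for _, l in self.signoffs), default=0)
        if SKILL_LEVELS[level] <= highest:
            raise ValueError("sign-off must come from a more senior level")
        self.signoffs.append((user, level))
        if level == "senior_manager":
            self.status = "signed off"

bh = DataSet("BH-042")
bh.sign_off("thandi", "junior")
bh.sign_off("marc", "geologist")
bh.sign_off("lee", "senior_manager")
```

Once the most senior sign-off lands, the status flips and the data set can be released for publication and reporting.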
CoordGEN Module
The CoordGEN module generates spatial positions. It uses borehole or any profile's collar and/or directional information to generate X, Y and Z co-ordinates at the top, middle or bottom of units taken from any SABLE® table. Result data sets are interfaced directly off the database to third-party applications through customised views and ODBC connections.
Spatial representation improves understanding of geoscientific data. The positions of samples and descriptions of material from the earth's crust can be co-ordinated from known survey stations and measured offsets. If this is carried out at the database level, the process is transparent and auditable, and the same spatial locations will be served to any third-party spatial modelling, CAD, GIS and other visualisation tools. SABLE®'s CoordGEN Module provides a number of advantages over the traditional practice of desurveying within modelling applications, as well as over graphical mapping techniques.
A transparent, flexible interface to configure the calculation directly off the primary measurements and readings. This avoids reformatting and creating multiple versions of the same data to match the needs of external applications.
Source data is validated before its use in the calculation, which can be performed immediately. Errors can be discovered and rectified at source at short notice. This enforces good quality data both for the calculation and for later use of the data. If the co-ordinates are generated and validated externally, there is no guarantee that corrections are fed back to the primary data set, which results in bad data management practice.
In addition to the surveyed point, CoordGEN supplies co-ordinates at any point, or at the top, middle and bottom of an interval, within distance-based data sets.
The module handles multiple deflections natively, originating both from the mother hole and within other deflections. Top-of-deflection co-ordinates and depth- and distance-based co-ordinates can be generated.
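Generating X, Y and Z co-ordinates from collar and directional information can be illustrated with a simple tangent-method desurvey. This is a generic sketch, not SABLE®'s calculation (production desurveying typically uses minimum curvature), and the conventions below — azimuth clockwise from north, dip in degrees below horizontal — are assumptions.

```python
from math import radians, sin, cos

def tangent_desurvey(collar, stations):
    """Tangent-method desurvey: each segment follows the orientation of the
    station at its top.

    collar   -- (x, y, z) of the hole collar
    stations -- list of (depth_m, azimuth_deg, dip_deg_below_horizontal),
                sorted by depth, with the first station at depth 0
    Returns a dict mapping station depth -> (x, y, z).
    """
    x, y, z = collar
    coords = {stations[0][0]: (x, y, z)}
    for (d0, az, dip), (d1, _, _) in zip(stations, stations[1:]):
        length = d1 - d0
        az_r, dip_r = radians(az), radians(dip)
        x += length * cos(dip_r) * sin(az_r)   # easting
        y += length * cos(dip_r) * cos(az_r)   # northing
        z -= length * sin(dip_r)               # elevation decreases downhole
        coords[d1] = (x, y, z)
    return coords

# Hypothetical hole: collar at (1000, 2000, 150), dipping 60° due east
collar = (1000.0, 2000.0, 150.0)
stations = [(0.0, 90.0, 60.0), (50.0, 90.0, 60.0)]
coords = tangent_desurvey(collar, stations)
```

Top, middle or bottom co-ordinates of a unit would then be obtained by interpolating between the computed station positions at the unit's from/to depths.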
Product Prediction Module
The current global market requires that mineral resources be utilised to produce more than one product. Mining cuts/zones can be predetermined from theoretical products based on significant criteria, so that optimal product yields can be estimated for a possible range of products.
This module provides the display and printing of the geochemical/geometallurgical curves which can be used to forecast required product yields. The results can be viewed on customised, scaled graphical reports together with geological, sampling and grade/quality parameters.
Computed results are written back to the database to bank knowledge, provide auditability and transparent interfacing of the data to reporting tools, CAD and modelling applications and other third party software.
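The idea of forecasting an optimal yield from a grade/quality curve can be sketched generically. The fractions below (mass percentage against an ash-style quality, best first) and the quality limit are hypothetical, not drawn from SABLE®.

```python
def cumulative_yield_curve(fractions):
    """Build cumulative (yield %, weighted quality) points from fractions
    sorted best-quality first -- the shape of a washability-style curve."""
    points, cum_mass, cum_quality = [], 0.0, 0.0
    for mass_pct, quality in fractions:
        cum_mass += mass_pct
        cum_quality += mass_pct * quality
        points.append((cum_mass, cum_quality / cum_mass))
    return points

def max_yield_at(points, quality_limit):
    """Highest cumulative yield whose weighted quality stays within the limit."""
    best = 0.0
    for yield_pct, quality in points:
        if quality <= quality_limit:
            best = yield_pct
    return best

# Hypothetical fractions: (mass %, ash %) ordered from cleanest to dirtiest
fractions = [(30.0, 8.0), (25.0, 12.0), (25.0, 18.0), (20.0, 30.0)]
points = cumulative_yield_curve(fractions)
```

Reading the curve at a product specification (here, a maximum ash) gives the theoretical yield for that product; repeating this across specifications estimates yields for a range of products.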
Prediction of Mining Elevations Module
The GradeCON module is used in conjunction with the Product Prediction Module (PrePROD) to provide geological predictions of mining elevations and product grades/qualities and yields ahead of mining. If the exploration drillhole data has been verified as reliable and the mine geologist is confident that there is enough data and detail, this existing data set can be used to predict the mining elevations to support planning and scheduling. But if this data is too sparse, chip and/or percussion sampling must be conducted to supply verifiable data through a procedure which includes auditable quality management. This platform requires the GradeCON Module for product forecasting into mining units.
The very same ETLQA, SATQC and WorkQM modules that are used to manage drill holes will be used to manage quality assurance on the additional sampling, laboratory analyses and data flows through the system.
After mining has taken place, GradeCON is used to recalculate product and yield estimates based on mined elevations. The same modules are used as before, but this time the quality management will be implemented to support the survey and face measurement data sets. These estimates can be used for grade accounting purposes.
Rock Mass Ratings Module
Health and safety regulations are being applied more and more stringently to reduce legal risk and prevent accidents during mining. International best practice is to estimate rock quality and competence proactively and apply it during the mine design stages, rather than reactively during support design. This module manages the mapping of rock quality parameters to rock mass quality factors, which are rated according to international standards and combined to produce rock mass quality estimates. These estimates can be interfaced to CAD and spatial modelling packages for viewing and display to aid safety and mine design.
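Rating and combining rock quality factors follows the general pattern of classification schemes such as Bieniawski's Rock Mass Rating, where component ratings are summed and the total is mapped to a quality class. The sketch below uses that general pattern with illustrative component values, not SABLE®'s configured standards.

```python
def rock_mass_rating(strength, rqd, spacing, condition, groundwater):
    """Combine five component ratings into a single RMR-style total (0-100).
    Each argument is a rating already assigned from the relevant lookup table."""
    return strength + rqd + spacing + condition + groundwater

def rmr_class(rmr):
    """Map a total rating to the conventional RMR quality classes."""
    if rmr > 80:
        return "very good"
    elif rmr > 60:
        return "good"
    elif rmr > 40:
        return "fair"
    elif rmr > 20:
        return "poor"
    return "very poor"

# Hypothetical component ratings for one logged interval
total = rock_mass_rating(strength=12, rqd=17, spacing=15,
                         condition=20, groundwater=10)
```

The per-interval totals and classes are the estimates that would then be interfaced to CAD and spatial modelling packages for display.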
Remote Manager Module
The SABLE® methodology requires validation at the point of capture if there is no hardcopy capture form. This enforces the work practice of validating the captured data against the Data Standard held in SABLE® before the user leaves the mapping/logging site. For this reason, we do not support an offline solution based on PDAs, which do not have enough resources to support the necessary validations and verification.
We recommend the use of SABLE® on a new generation tablet PC such as a Panasonic Toughbook. These PCs are rugged, can be shared by users and are almost indestructible.
Where capture must take place away from a network, SABLE® is installed with a SQL Server Express back end. The TRANSit Module will manage the remote capture of data away from the central database, preventing version control issues. SABLE®'s Data In Transit technology is used to transfer data between the remote and central databases.