In 2010, we started automatically scraping data to create maps that hadn't been made before

Google Cloud's platform allowed for continuously operating networks of databases and servers that could handle the hard work of finding, curating, and compiling worldwide datasets. R, Python, Golang, Geoserver and OGC formed the foundation of a new era in GIS.
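The find-curate-compile pipeline described above can be sketched in miniature. This is a hypothetical illustration using only Python's standard library: it parses a hardcoded HTML table standing in for a page a scraper would fetch from a public data portal (the station names and coordinates are invented for the example).

```python
from html.parser import HTMLParser

# Hypothetical example page; in practice the scraper would fetch
# pages like this from public data portals.
SAMPLE_PAGE = """
<table>
  <tr><td>Station A</td><td>-30.5</td><td>121.4</td></tr>
  <tr><td>Station B</td><td>-31.2</td><td>120.9</td></tr>
</table>
"""

class StationScraper(HTMLParser):
    """Collect (name, lat, lon) rows from a simple HTML table."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_td = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self._in_td = True
        elif tag == "tr":
            self._row = []

    def handle_endtag(self, tag):
        if tag == "td":
            self._in_td = False
        elif tag == "tr" and self._row:
            name, lat, lon = self._row
            self.rows.append((name, float(lat), float(lon)))

    def handle_data(self, data):
        if self._in_td and data.strip():
            self._row.append(data.strip())

scraper = StationScraper()
scraper.feed(SAMPLE_PAGE)
# scraper.rows now holds cleaned, typed records ready to compile
```

A real deployment would add fetching, scheduling, and storage around a core like this; the parsing-and-typing step is the part that turns raw pages into a curated dataset.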

By 2014, exploratory factor analysis had become our standard operating procedure for examining a dataset's internal relationships

The way data interrelates is the foundation of our geoscience. Through analyzing geophysics, soil and rock geochemistry, and structural patterns, we've discovered two important things: 1) our results confirm the findings of many researchers who came before us (e.g. advanced argillic assemblages), and 2) we can build on this insight to uncover the unique characteristics of your hydrothermal system.
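As a minimal sketch of the factor-analysis step, the example below fits scikit-learn's `FactorAnalysis` to synthetic multi-element geochemistry in which two latent factors drive six measured elements. The data, loadings, and factor count are all illustrative assumptions, not real assay results or the exact workflow used here.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Synthetic geochemistry: two latent factors driving six measured
# elements (invented loadings, stand-ins for real assay columns).
latent = rng.normal(size=(500, 2))
loadings = np.array([
    [0.9, 0.0], [0.8, 0.1], [0.7, 0.2],   # elements tied to factor 1
    [0.1, 0.9], [0.0, 0.8], [0.2, 0.7],   # elements tied to factor 2
])
X = latent @ loadings.T + 0.1 * rng.normal(size=(500, 6))

fa = FactorAnalysis(n_components=2, random_state=0)
scores = fa.fit_transform(X)    # per-sample factor scores
# fa.components_ holds the estimated loadings, one row per factor:
# elements that load on the same factor vary together in the data.
```

Reading the loadings in `fa.components_` is where the geology comes in: groups of elements that load on the same factor suggest a shared process, such as a particular alteration assemblage.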

In late 2020 we added machine learning to classify complex data, and then let it run unsupervised

The first test used supervised k-nearest-neighbour classification of hyperspectral readings on drill core. What we discovered: this analysis is far more detailed than, and superior to, the traditional classification methods available in most off-the-shelf software.
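A supervised nearest-neighbour classification of spectra can be sketched as follows. The "spectra" here are synthetic (300 samples over 50 bands, two classes with different mean reflectance) and the neighbour count is an arbitrary choice; this illustrates the technique, not the actual model or data used on the core.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)

# Synthetic hyperspectral readings: 300 samples x 50 bands,
# two "mineral" classes with different mean reflectance.
n, bands = 300, 50
labels = rng.integers(0, 2, size=n)
means = np.where(labels[:, None] == 0, 0.3, 0.7)
spectra = means + 0.05 * rng.normal(size=(n, bands))

X_train, X_test, y_train, y_test = train_test_split(
    spectra, labels, test_size=0.25, random_state=0)

# Classify each held-out spectrum by its 5 nearest labelled neighbours.
clf = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

Because each test spectrum is compared against the full labelled training set across every band, the classifier picks up on spectral shape that simple threshold-based mineral maps miss.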

Now that we have confidence in the system, we're applying numerous techniques across many datasets, uncovering new relationships that are rapidly pushing the boundaries of exploration geoscience.

Today, we are using these tools to enable discovery

Data analysis should be fast, easy, and solve a problem. Our system is built to handle inbound data that isn't pure and clean (the real world). AI, ML, and big data are buzzwords that should stem from a single foundational concept: use this new intelligence to improve mineral targeting, generate discovery success, and save money.
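Handling data that isn't pure and clean might look like the sketch below: coercing numeric fields, resolving below-detection assay values, and dropping unusable rows. The field names (`hole`, `au_ppm`), the half-detection-limit convention, and the example records are all assumptions made for illustration.

```python
def clean_assays(raw_rows):
    """Normalise messy assay records: coerce grades to floats,
    treat below-detection values ('<0.5') as half the detection
    limit, and drop rows with no usable grade."""
    cleaned = []
    for row in raw_rows:
        value = str(row.get("au_ppm", "")).strip()
        if value.startswith("<"):            # below detection limit
            try:
                grade = float(value[1:]) / 2.0
            except ValueError:
                continue
        else:
            try:
                grade = float(value)
            except ValueError:
                continue                     # e.g. 'n/a', '', 'pending'
        cleaned.append({"hole": row.get("hole", "").strip(),
                        "au_ppm": grade})
    return cleaned

# Hypothetical inbound records, exactly as messy sources deliver them.
raw = [
    {"hole": "DH-01 ", "au_ppm": "1.25"},
    {"hole": "DH-02", "au_ppm": "<0.5"},
    {"hole": "DH-03", "au_ppm": "n/a"},
]
result = clean_assays(raw)
```

The point is that cleaning rules like these live in code, not in manual spreadsheet fixes, so every dataset that flows in gets the same treatment automatically.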