AnalysisCenter

We are a data science team bringing deep learning capabilities into industry to deliver super-human performance and unprecedented returns.

We got our start developing the medical deep learning frameworks RadIO and CardIO, which are now used by research and educational institutions all over the world.

Our current focus is oil and gas upstream, including seismic and petrophysics processing and interpretation. With machine learning at hand, we aim to equip industry experts with fast and accurate algorithms, as well as objective metrics to exhaustively assess quality and performance.

What we do

SeismicPRO

An essential library built specifically for ML models on seismograms. It packages machine learning for field seismic data processing into a ready-to-use set of tools for noise attenuation, first-break picking, spherical divergence correction and more. To simplify your routine, it also provides tools to load, process and visualize pre-stack seismic data in different representations, such as raw seismic traces, common shot gathers, etc.
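As a toy illustration of one of the steps above, here is what spherical divergence correction looks like in plain NumPy. This is a minimal sketch of the standard t·v² gain, not the SeismicPRO API; the function name and signature are our own for illustration.

```python
import numpy as np

def spherical_divergence_correction(trace, dt, velocity):
    """Compensate geometric spreading by scaling each sample with t * v(t)**2.

    trace    : 1D array of trace amplitudes
    dt       : sample interval, s
    velocity : 1D array of RMS velocities per sample, m/s
    """
    t = np.arange(len(trace)) * dt       # two-way time of each sample
    gain = t * velocity ** 2             # classical spreading-correction gain
    return trace * gain

# Toy example: a decaying trace with a constant velocity model
trace = 1.0 / (1.0 + np.arange(100))
velocity = np.full(100, 2000.0)
corrected = spherical_divergence_correction(trace, dt=0.002, velocity=velocity)
```

Note that the gain is zero at t = 0, so the very first sample is muted by construction; in practice the gain function and velocity model are survey-dependent.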

SeismiQB

SeismiQB facilitates deep learning research on 3D cubes of seismic data. It contains tools for convenient data loading, preprocessing and adjusting ML pipelines for 3D seismic interpretation (e.g. horizon picking, fault detection, reef detection, etc.). The included preprocessing methods range from common distortions (like additive or multiplicative noise, flips, rotations, etc.) to complex geological transformations into different domains (namely, magnitude, frequency or phase).
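The simple distortions mentioned above can be sketched in a few lines of plain NumPy. This is an illustrative stand-in, not the SeismiQB API; the function and the crop shape are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

def augment_crop(crop, rng):
    """Apply simple training-time distortions to a 3D seismic crop:
    a random flip along the first axis, then multiplicative and additive noise."""
    if rng.random() < 0.5:
        crop = crop[::-1]                                          # flip along the inline axis
    crop = crop * (1 + 0.1 * rng.standard_normal(crop.shape))      # multiplicative noise
    crop = crop + 0.05 * rng.standard_normal(crop.shape)           # additive noise
    return crop

crop = rng.standard_normal((8, 16, 32))   # toy inline x crossline x depth crop
aug = augment_crop(crop, rng)
```

Distortions like these cheaply multiply the effective training set, which matters when labeled seismic volumes are scarce.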

Petroflow

An efficient and versatile tool for working with well and core data. With it, one can conveniently train machine learning models for tasks such as core-to-log matching, porosity prediction, reservoir detection from logs and many others.

PyDEns

PyDEns enables solving Ordinary and Partial Differential Equations (ODEs & PDEs) using neural networks. This task is crucial for fluid modelling on an operating oil rig, which is currently done with conventional PDE solvers that are simply not fast enough for the decision-making process.
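The core idea can be sketched in a few lines of PyTorch. This is a minimal physics-informed toy, not the PyDEns API: we train a small network so that the trial solution u(x) = 1 + x·net(x) satisfies the ODE u′ = u with u(0) = 1, whose exact solution is eˣ.

```python
import torch

torch.manual_seed(0)
# A small network; the trial form 1 + x * net(x) enforces u(0) = 1 by construction.
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1)
)
opt = torch.optim.Adam(net.parameters(), lr=1e-2)

for _ in range(1000):
    x = torch.rand(64, 1, requires_grad=True)   # random collocation points in [0, 1]
    u = 1 + x * net(x)                          # trial solution
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    loss = ((du - u) ** 2).mean()               # squared residual of u' - u = 0
    opt.zero_grad()
    loss.backward()
    opt.step()

x1 = torch.ones(1, 1)
approx = (1 + x1 * net(x1)).item()              # should approach e ≈ 2.718
```

Unlike a grid-based solver, the trained network gives the solution at any point in the domain at the cost of a single forward pass, which is what makes the approach attractive for fast decision making.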

Batchflow

The core library of our entire stack, as it sets the format, approach and notation for everything we deliver. Batchflow provides all the necessary features for parallelism, optimal GPU usage, powerful research across a variety of state-of-the-art neural network architectures, and overall performance speed-ups.

Standards

The key ingredient for reproducibility and scalability of your ML project. We have accumulated best practices from state-of-the-art deep learning research and defined principles for code quality, dataset annotation, model description, evaluation and criticism, and hyperparameter selection, with a final touch of data versioning and a standardized project structure.

Together with our processing experts, we have developed quality control criteria for the entire survey area after each processing step

The first step in finding an oil reservoir is building an accurate picture of the subsurface from raw seismic data. We've created a set of deep learning models covering a part of a typical seismic processing pipeline, such as first-break picking and geometry assessment. We've also developed and implemented a number of quantitative metrics to automatically perform quality control of the results after each processing step.
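For context, here is the classical non-neural baseline for first-break picking: the STA/LTA energy ratio, which flags the first sample where short-term average energy just after a point exceeds the long-term average just before it. This is a generic textbook method sketched in NumPy, not the deep learning models described above.

```python
import numpy as np

def sta_lta_pick(trace, dt, sta=0.01, lta=0.1, threshold=10.0):
    """Return the first-break time (s) picked by an STA/LTA energy ratio."""
    ns, nl = int(sta / dt), int(lta / dt)
    energy = trace ** 2
    c = np.cumsum(np.concatenate([[0.0], energy]))
    mean_win = lambda n: (c[n:] - c[:-n]) / n    # window means, indexed by window start
    sta_a, lta_a = mean_win(ns), mean_win(nl)
    idx = np.arange(nl, len(trace) - ns + 1)     # candidate pick positions
    ratio = sta_a[idx] / (lta_a[idx - nl] + 1e-12)
    return idx[np.argmax(ratio > threshold)] * dt

# Synthetic trace: weak noise, then a strong 30 Hz arrival at 0.3 s
dt = 0.002
t = np.arange(0, 1, dt)
rng = np.random.default_rng(0)
trace = 0.01 * rng.standard_normal(t.size)
trace[t >= 0.3] += np.sin(2 * np.pi * 30 * (t[t >= 0.3] - 0.3))
pick = sta_lta_pick(trace, dt)
```

Baselines like this are also where quantitative QC metrics pay off: comparing neural picks against a simple detector over an entire survey is a cheap sanity check.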

Detected horizons rigorously follow the tracked phase and meet all the requirements of seismic and geology specialists

After a post-stack cube is obtained, experts start locating objects of interest, such as horizons and faults. Our neural networks solve these tasks orders of magnitude faster with super-human quality. Moreover, the proposed approach allows detecting areas of low picking quality, as well as data anomalies like noise and a low signal-to-noise ratio.

Several hours instead of weeks for complete porosity prediction

In order to draw conclusions about a reservoir, we need to know not only its location but also its physical properties. Our model for porosity prediction from well logs discovers more complex dependencies than the classical petrophysical approach, resulting in a 20% performance gain. Additionally, we've optimized several routine data preprocessing operations, which allowed us to significantly speed up well data interpretation.
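The shape of such a log-to-porosity task can be sketched with a generic regressor on synthetic data. This is an illustration only — made-up logs and an off-the-shelf scikit-learn model, not our porosity model or the Petroflow API.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
gr = rng.uniform(20, 150, n)      # gamma ray, API units
rhob = rng.uniform(2.0, 2.8, n)   # bulk density, g/cc
nphi = rng.uniform(0.05, 0.4, n)  # neutron porosity, v/v
# Toy "true" porosity: density-porosity blended with neutron reading, plus noise
phi = 0.6 * (2.71 - rhob) / (2.71 - 1.0) + 0.4 * nphi + 0.01 * rng.standard_normal(n)

X = np.column_stack([gr, rhob, nphi])
X_tr, X_te, y_tr, y_te = train_test_split(X, phi, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
r2 = model.score(X_te, y_te)      # held-out R^2 of the porosity prediction
```

Real well data adds the hard parts — depth alignment, missing intervals, tool effects — which is exactly the preprocessing routine worth optimizing.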

We stay open - always

Once we get consistent results in a development branch, we publish a transparent and richly detailed paper. Top picks for 2020 are:
SeismiQB -- a novel framework for deep learning with seismic data
Seismic horizon detection with neural network
Metric for evaluating difference between seismic gathers (to be published late October)

And after that, of course, we strike down upon thee with great vengeance and furious anger. Fortunately, there is always "Yet another Hackathon" to lay our hands on.

But when we get tired of working wonders all day long, we simply get together with great groups and share knowledge in the best ways there are:
"EAGE Practical Sessions"
"Open Data Science Community Support"
And, of course, the classic end-of-day "Science Talk in a Bar"

Get in touch

If you have been in the field as long as we have, then you know: it gets lonely out here.
So, if you have a proposal for collaboration in mind, or would simply like to let us know what you think, drop us an email.