In the software, engineers generate wells, wellbores, well designs, technical calculations, and experiences. All of these are available through our API, together with per-user history. The amount of data we create in well planning is orders of magnitude larger than with the manual process.
We then make all of this data available through APIs, so you can integrate it with any other application or data store.
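As a rough illustration of what such an integration can look like, the sketch below fetches the wellbores of a well over a REST-style API. The host, endpoint path, and bearer-token authentication are assumptions for the example, not the actual API surface.

```python
# Hypothetical sketch of consuming well-planning data over a REST API.
# The base URL, endpoint layout, and auth scheme are illustrative
# assumptions, not a documented vendor API.
import json
import urllib.request

BASE_URL = "https://api.example.com/v1"  # placeholder host


def wellbore_endpoint(well_id: str) -> str:
    """Build the (hypothetical) endpoint listing a well's wellbores."""
    return f"{BASE_URL}/wells/{well_id}/wellbores"


def fetch_wellbores(well_id: str, token: str) -> list:
    """Fetch wellbores as JSON; any HTTP client would do the same."""
    req = urllib.request.Request(
        wellbore_endpoint(well_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Because the data arrives as plain JSON over HTTP, the same pattern works for pushing it onward into another application or data store.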
Returning data to the original databases raises a question: when users improve data, or when we combine multiple sources of one dataset, how should your other storage, which holds only a single version of the truth, handle this? Should we update your storage on every change and then read it back from there? Our experience is that this is slow and prone to unexpected failures. We have to be responsible for a robust data-improvement process that delivers high-quality data; no user accepts that a previous fix comes back as a problem.
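One common way to keep user fixes from regressing, sketched below under assumed names, is to store corrections separately from the raw source and reapply them on every read. A re-ingest of the original data then cannot silently undo an earlier fix. This is an illustration of the principle, not our actual implementation.

```python
# Sketch: keep user corrections in their own log, keyed by record and
# field, and overlay them on freshly ingested raw data. All names and
# values here are made up for illustration.

corrections: dict = {}  # (record_id, field) -> corrected value


def record_fix(record_id: str, field: str, value) -> None:
    """Remember a user's correction independently of the raw source."""
    corrections[(record_id, field)] = value


def apply_fixes(record_id: str, raw: dict) -> dict:
    """Overlay stored corrections on a freshly ingested raw record."""
    fixed = dict(raw)
    for (rid, field), value in corrections.items():
        if rid == record_id:
            fixed[field] = value
    return fixed


# A user corrects a depth; a later re-ingest of the raw source
# no longer reverts the fix.
record_fix("WB-42", "total_depth_m", 3120.0)
raw = {"total_depth_m": 3000.0, "status": "planned"}
print(apply_fixes("WB-42", raw))  # total_depth_m stays corrected
```

A storage system that keeps only one version of the truth has no natural place for such a correction layer, which is why simply writing changes back and re-reading them tends to lose fixes.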
In the next stages, we can collaborate on data-quality measures so that live data sharing with data lakes or other systems becomes possible, but we need your guidance on how to achieve this.