Editorial disclosure: This article reflects the independent analysis and professional opinion of the author, informed by published research, vendor documentation, industry surveys, and practitioner experience. No vendor reviewed or influenced this content prior to publication.
This is the article that every VP of Engineering, data analyst joining an E&P company, and IT director evaluating BI tools eventually searches for. The question comes up in every data modernization conversation: which business intelligence platform should we standardize on?
The honest answer is more nuanced than any vendor will tell you, and it starts with a fact that surprises people outside the industry: Tableau -- the most popular BI tool in the world by many measures -- is a minor player in upstream oil and gas. The E&P BI market is a two-horse race between Microsoft Power BI and TIBCO Spotfire, with a growing field of open-source and operational tools filling gaps that neither fully covers.
This article is a practitioner-level comparison based on what operators actually deploy, not what vendors pitch at conferences. If you are evaluating BI tools for an upstream operation, or trying to understand why your current tool is failing your engineering team, this is the breakdown you need.
The Big Three (and Why It Is Really the Big Two)
Every enterprise BI evaluation starts with the same three names: Power BI, Tableau, Spotfire. In most industries, Tableau and Power BI dominate, and Spotfire is niche. In upstream oil and gas, the picture is reversed. Spotfire and Power BI co-dominate E&P analytics, and it is Tableau that occupies the niche position.
This is not arbitrary. It reflects the specific nature of petroleum engineering data: time-series sensor data from thousands of wells, spatial data tied to lease boundaries and well locations, engineering calculations that require statistical regression and curve fitting, and workflows built by engineers who think in exceptions rather than dashboards.
Per the E&P Software Survey and industry analysis, the market breaks down like this:
| Tool | Market Position in E&P | Primary Users |
|---|---|---|
| Spotfire (TIBCO / Cloud Software Group) | Co-dominant | Petroleum engineers, geoscientists, reservoir engineers |
| Power BI (Microsoft) | Co-dominant | Operations teams, management, finance, data analysts |
| Tableau (Salesforce) | Minor presence | Finance teams, corporate reporting, non-technical groups |
| Grafana | Growing for operations | Production monitoring, DevOps-adjacent teams |
| Qlik | Niche | Some adoption, less common than Spotfire/Power BI |
A key finding from our research: a surprising number of E&P companies end up running both Spotfire and Power BI. Spotfire for technical engineering analysis. Power BI for enterprise distribution and executive reporting. This is not a failure of strategy -- it reflects a genuine difference in what the tools are good at.
Power BI: The Microsoft Default That Actually Works
What It Does Well
Power BI has become the default BI tool for mid-size E&P operators for reasons that have nothing to do with petroleum engineering and everything to do with the Microsoft ecosystem. If your company already runs Microsoft 365, Azure, and SharePoint -- and the vast majority of operators do, given that Azure dominates O&G cloud with roughly 57% market share per Kimberlite survey data -- then Power BI is already in your software budget at $10 per user per month for Pro licenses.
Strengths for E&P:
- Cost. Power BI Pro is $10/user/month. Premium Per User is $20/user/month. For a 50-person organization, you are looking at $6,000 to $12,000 per year for full BI capabilities. This is a fraction of what Spotfire or Tableau costs.
- Microsoft ecosystem integration. Connects natively to Azure SQL, Azure Data Lake, Databricks on Azure, SharePoint, and Teams. This matters because the majority of O&G operators are on Azure.
- Distribution and sharing. Power BI's strength is not in building complex analyses -- it is in publishing them. A dashboard built by one data analyst can be embedded in Teams, shared via a link, scheduled for email delivery, and accessed on mobile. This is what makes it the go-to tool for morning reports and executive dashboards.
- DAX and Power Query. The data modeling layer is genuinely powerful. DAX expressions can handle complex calculations, time intelligence, and business logic that would require custom code in other tools.
- Ease of use for non-technical users. A business analyst or production engineer can build a functional dashboard in Power BI in days, not weeks. The learning curve is the lowest of the three major tools.
- Community and templates. The Power BI community is massive. Templates, learning resources, and consultants are abundant and affordable.
Who uses it: Most mid-size operators (Permian Resources uses it alongside Spotfire), small operators who can only afford one tool, and essentially every large independent and supermajor for executive reporting and enterprise distribution.
Where It Falls Short
Power BI is not an engineering tool. It was designed for business analytics, and it shows in several ways that matter to petroleum engineers:
- Limited real-time capability. Power BI's streaming datasets and push datasets offer near-real-time, but they are not true real-time in the way that SCADA systems or Grafana dashboards are. Refresh schedules are typically 30 minutes at minimum for import models, and DirectQuery has performance limitations with large datasets.
- Statistical analysis is basic. Power BI can do trend lines and basic statistics. It cannot do multivariate regression, decline curve analysis, or the kind of engineering-grade statistical work that Spotfire handles natively.
- Visualization flexibility is limited. Power BI's charts are clean but rigid. Custom visuals from the marketplace help, but they lack the interactive, drill-anywhere flexibility that engineers are accustomed to in Spotfire.
- Not built for exploration. Power BI excels at structured, pre-built dashboards. It is less effective for ad-hoc data exploration where an engineer wants to throw 15 columns against each other to find patterns. This is where Spotfire dominates.
- Row limits on large datasets. Import mode has a 1GB compressed data limit per dataset (more in Premium). For operators with millions of rows of high-frequency sensor data, this requires careful data modeling.
The honest verdict on Power BI: It is the right choice for 80% of what an E&P company needs from BI -- morning reports, KPI dashboards, executive reporting, and data distribution. It is the wrong choice for the 20% that petroleum engineers care most about: ad-hoc technical analysis and engineering-grade visualization.
Spotfire: The Engineer's Tool That Refuses to Die
Why E&P Engineers Love It
Spotfire has been embedded in petroleum engineering workflows since the mid-2000s. It dominated E&P analytics in the early 2010s, and despite the rise of Power BI and despite TIBCO's corporate upheaval (acquired by Vista Equity, merged with Citrix into Cloud Software Group), Spotfire retains a loyal following in upstream because it was purpose-built for the kind of analytical work that petroleum engineers do.
Strengths for E&P:
- Built for engineering data exploration. Spotfire's core design philosophy is interactive visual analysis. An engineer can load a dataset, create a scatter plot, color it by one variable, size it by another, filter by a third, and drill into individual data points -- all in a fluid, responsive interface.
- Advanced statistical analysis. Native support for multivariate regression, clustering, decision trees, and curve fitting. Decline curve analysis, type curve matching, and production forecasting can be done directly in Spotfire without exporting data to R or Python.
- O&G-specific capabilities. Map charts with lease boundary overlays, well symbol maps, cross-plots that petroleum engineers expect. Spotfire has 20 years of O&G-specific feature development that newer tools lack.
- Integration with technical data sources. Connects to PI historians, OFM, Petrel, and other engineering data systems. Spotfire has pre-built connectors and workflows for the tools that E&P companies actually use.
- TERR and Python data functions. TERR (TIBCO Enterprise Runtime for R) is Spotfire's embedded R execution engine, served through Spotfire Statistics Services, and recent Spotfire versions add native Python data functions. Engineers can run statistical scripts directly within Spotfire visualizations, which is something Power BI can technically do but not nearly as seamlessly.
- Collaboration and sharing (improved). Spotfire Business Author and Consumer licenses, plus Spotfire Cloud, have improved the distribution story. Still not as frictionless as Power BI, but much better than five years ago.
Who uses it: Supermajors (alongside Power BI), large independents, and mid-size operators with strong engineering cultures. Permian Resources uses Spotfire for technical analysis alongside Power BI for enterprise reporting. Most operators with 1,000+ wells have at least some Spotfire licenses.
Where It Falls Short
- Cost. Spotfire is expensive. Analyst licenses can run $1,500 to $3,000+ per user per year, depending on deployment model and negotiation. For a team of 20 engineers, you are looking at $30,000 to $60,000+ annually -- before server costs if on-premise. This is 5-10x the cost of Power BI.
- Steep learning curve. Spotfire is powerful because it is complex. Getting real value out of it requires training, and training petroleum engineers on BI tools is like pulling teeth. Many Spotfire deployments end up being used by two or three power users who build dashboards for everyone else, which undermines the tool's core strength of self-service exploration.
- Distribution is weaker. Sharing Spotfire analyses with non-technical users -- executives, field personnel, business partners -- is harder than with Power BI. Embedding in Teams or SharePoint is possible but not native. The Consumer license helps, but the overall experience is less polished than Power BI's publishing workflow.
- TIBCO's corporate situation. Cloud Software Group (the parent company after the Citrix/TIBCO merger) carries significant debt. The product continues to receive investment, but the corporate instability makes some IT departments nervous about long-term commitment.
- Cloud transition is ongoing. Spotfire Cloud exists, but many E&P deployments are still on-premise. The migration to cloud-native is happening but is behind Power BI and Tableau in maturity.
The honest verdict on Spotfire: It remains the best tool for petroleum engineering analysis -- full stop. No other BI platform matches its combination of statistical depth, data exploration flexibility, and O&G-specific features. But it is expensive, hard to learn, and increasingly hard to justify as the sole BI platform when Power BI handles 80% of use cases at a tenth of the cost.
Tableau: The World's Most Popular BI Tool That E&P Barely Uses
This section will frustrate Tableau advocates, but the data is clear: Tableau is not a primary choice in upstream oil and gas. It is a strong tool -- arguably the best general-purpose BI platform ever built -- but upstream E&P is not a general-purpose environment.
Why Tableau Struggles in E&P
- Designed for business analytics, not engineering data. Tableau's sweet spot is sales dashboards, marketing funnels, customer analytics, and executive reporting. Petroleum engineering data is messy, high-frequency, multi-dimensional, and requires domain-specific visualizations that Tableau does not offer out of the box.
- Weak time-series handling for operational data. Tableau can chart time-series data, but it lacks the native capabilities for handling SCADA-frequency data (1-second to 1-minute intervals), historian data with irregular timestamps, and the kind of time-intelligence calculations that production monitoring requires.
- No native statistical analysis. Tableau can do trend lines and basic calculations. It cannot do multivariate regression, decline curve fitting, or the statistical analysis that petroleum engineers need.
- No O&G ecosystem. Tableau does not have pre-built connectors for PI historians, OFM, or other E&P-specific data systems. It does not have O&G-specific map overlays, well symbol libraries, or engineering visualization templates.
- Cost without justification. Tableau Creator is approximately $75/user/month ($900/year). Explorer is $42/user/month ($504/year). This is more expensive than Power BI and less capable than Spotfire for E&P use cases, putting it in an awkward middle ground.
Where Tableau IS Used in O&G
Tableau does appear in oil and gas companies, but typically in departments that look like any other enterprise:
- Finance and accounting teams who came from non-O&G backgrounds and brought Tableau with them.
- Corporate reporting where the data is structured business data (revenue, OPEX, headcount) rather than engineering data.
- Midstream and marketing groups that deal with commodity pricing and logistics, where the data looks more like standard enterprise data.
- Companies where a Salesforce relationship already exists and Tableau came bundled.
The honest verdict on Tableau: If you are evaluating BI tools for E&P operations, Tableau should not be on the shortlist. It is a fantastic tool that solves problems the upstream industry does not have. If your finance team already uses it, let them keep it -- but do not try to force it onto production engineers.
The Other Contenders Worth Knowing
The BI landscape in E&P is not limited to the Big Three. Several tools are carving out meaningful niches, and two in particular are worth serious evaluation.
Grafana: The Open-Source Disruptor
Grafana has emerged as a legitimate production monitoring tool in E&P, driven by operators who want real-time operational dashboards without the cost or complexity of Spotfire.
Case study: Whiting Oil and Gas built 300+ Grafana dashboards with 200+ active users for well monitoring. This is not a pilot. This is a production deployment at scale.
Why Grafana is gaining traction:
- True real-time. Grafana was built for time-series monitoring. It handles 1-second refresh rates without breaking a sweat, which neither Power BI nor Spotfire can match for operational monitoring.
- Free (open-source core). Grafana OSS is free. Grafana Cloud starts at $0 for small deployments. Even the enterprise tier is cheaper than Spotfire.
- Connects to everything. Native connectors for InfluxDB, TimescaleDB, Prometheus, PostgreSQL, and critically, AVEVA PI System via the PI Web API. This means Grafana can sit directly on top of the historian that 85% of top O&G companies already run.
- Alerting built in. Grafana's alerting system can trigger notifications based on real-time thresholds -- exactly what exception-based monitoring requires.
Limitations: Grafana is not a BI tool in the traditional sense. It lacks the data modeling, DAX-equivalent calculations, and polished reporting that Power BI provides. It is a monitoring tool that complements, rather than replaces, a BI platform.
Qlik
Qlik maintains a niche presence in O&G, with some operators using Qlik Sense for associative data exploration. Its associative engine -- which highlights relationships across all dimensions simultaneously -- is genuinely different from the filter-based approach of Power BI and Tableau. However, adoption in upstream E&P is limited compared to Spotfire and Power BI, and the vendor ecosystem is smaller.
Custom Web Dashboards (Python Dash, Streamlit, Plotly)
A growing number of E&P data teams are building custom dashboards using Python-based frameworks:
- Streamlit for rapid prototyping and internal tools. A data engineer can build a functional well performance dashboard in a day.
- Dash (by Plotly) for production-grade web applications. More complex than Streamlit but more robust.
- Panel (HoloViz) for teams already using Python's scientific computing stack.
These tools are particularly popular with teams that have data engineers or data scientists but limited BI budgets. The trade-off is clear: total flexibility and zero license cost, but you need developers to build and maintain everything.
What Production Engineers Actually Need from BI
Before choosing a tool, it helps to understand what production engineers actually do with dashboards. We covered this in depth in our production dashboard design guide, but the core requirements are worth restating because they drive the tool selection decision.
The Non-Negotiable Capabilities
Exception-based monitoring. Production engineers do not want to look at 500 wells that are running fine. They want to see the 12 wells that are not. Any BI tool that leads with aggregate metrics and buries the exceptions is useless for daily operations. This means the tool needs robust filtering, conditional formatting, and ideally anomaly detection or threshold-based highlighting.
Well-level drill-down. A production engineer who sees a well flagged as underperforming needs to drill into that well's production history, pressure trends, chemical injection records, and recent workover history -- all from the same interface. This requires either a well-modeled data architecture behind the dashboard or a tool flexible enough to handle ad-hoc queries against multiple data sources.
Real-time SCADA integration. If the BI tool cannot display data fresher than 30 minutes, it is not a production monitoring tool. It is a reporting tool. There is nothing wrong with reporting tools, but do not pretend they are monitoring tools. True production surveillance requires near-real-time data from SCADA systems like CygNet, zdSCADA, eLynx, or Ignition.
Mobile access for field use. Pumpers and field technicians need to see well data on phones and tablets at the wellsite. This means responsive design, offline capability (or at minimum, low-bandwidth tolerance), and interfaces simple enough to use in bright sunlight wearing gloves. Power BI mobile is serviceable. Spotfire mobile exists but is less polished. Grafana mobile works but is not pretty. Dedicated field apps (GreaseBook, ProdView Go, eLynx mobile) often win here.
Time-series visualization. Production data is inherently time-series. The tool must handle irregular timestamps, multiple Y-axes, gap handling for shut-in periods, and overlay capabilities for comparing wells or time periods.
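To make the exception-first requirement concrete: at minimum, "threshold-based highlighting" is a computation like the one below, which a BI tool either supports natively or receives pre-computed from the data layer. A sketch in pandas; the column names (`well`, `date`, `oil_rate`) and the 10% drop threshold are illustrative assumptions, not a standard.

```python
import pandas as pd

def flag_exceptions(df, pct_threshold=0.10):
    """Flag wells whose latest daily rate dropped more than
    pct_threshold versus their trailing 7-day average."""
    out = []
    for well, g in df.sort_values("date").groupby("well"):
        baseline = g["oil_rate"].iloc[:-1].tail(7).mean()
        latest = g["oil_rate"].iloc[-1]
        if baseline > 0 and (baseline - latest) / baseline > pct_threshold:
            out.append({
                "well": well,
                "baseline": round(baseline, 1),
                "latest": latest,
                "drop_pct": round(100 * (baseline - latest) / baseline, 1),
            })
    return pd.DataFrame(out)

# Two illustrative wells: A is steady, B drops ~32% on the last day
data = pd.DataFrame({
    "well": ["A"] * 8 + ["B"] * 8,
    "date": list(pd.date_range("2024-01-01", periods=8)) * 2,
    "oil_rate": [100, 101, 99, 100, 102, 98, 100, 97,
                 250, 248, 252, 251, 249, 250, 252, 170],
})
print(flag_exceptions(data))  # one row: well B
```

The output is the screen the engineer actually wants: the 12 wells that are wrong, not the 500 that are fine.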
Connecting BI to the Data Stack
A BI tool is only as good as the data feeding it. The most common frustration we see is operators who invest in Power BI or Spotfire but never solve the data plumbing problem, resulting in dashboards that display stale, incomplete, or incorrect data.
The Typical Data Flow
For most mid-size operators, the data flow looks like this:
[SCADA / Field Systems] → [PI Historian] → [SQL Server or Data Lake] → [BI Tool]
For more progressive operators (Permian Resources being the reference example):
[SCADA] → [Cloud Data Lake] → [Databricks/Snowflake] → [dbt transforms] → [Spotfire + Power BI]
Power BI Data Connectivity
Power BI offers two primary modes for connecting to data sources:
Import Mode. Data is loaded into Power BI's in-memory engine (VertiPaq). Fastest query performance, supports full DAX calculations, but data is only as fresh as the last refresh. Scheduled refresh can run up to 8 times per day on Pro, 48 times per day on Premium. This is fine for daily morning reports. It is not fine for real-time monitoring.
DirectQuery Mode. Queries are sent directly to the source database in real-time. No data staleness, but query performance depends entirely on the source system. If your source is a slow SQL Server with no indexing, your dashboard will be slow. DirectQuery also limits some DAX functions and visual types.
Composite Mode. Combines Import and DirectQuery tables in the same model. This is the right architecture for most E&P deployments: import historical data for fast analysis, DirectQuery for recent/real-time data from the historian.
Connecting to PI System. AVEVA PI System does not have a native Power BI connector. The standard approach is to use PI SQL Data Access (OLEDB Enterprise) to query PI data from Power BI via DirectQuery or import. Alternatively, pipe PI data into Azure Data Lake or Snowflake using PI Integrator for Azure, and connect Power BI to the cloud destination. The second approach is better for performance and scalability.
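For teams that need to script against PI directly -- to validate connectivity, or to land historian data in a lake outside PI Integrator -- AVEVA's PI Web API exposes the same archive over REST. A minimal sketch of building a recorded-values request; the hostname and WebId are placeholders, and authentication and TLS settings are deployment-specific.

```python
from urllib.parse import urlencode

def pi_recorded_values_url(base_url, web_id, start="*-8h", end="*"):
    """Build a PI Web API 'recorded values' request URL for one tag.
    base_url and web_id are deployment-specific placeholders."""
    qs = urlencode({"startTime": start, "endTime": end})
    return f"{base_url}/streams/{web_id}/recorded?{qs}"

url = pi_recorded_values_url("https://pi.example.com/piwebapi", "F1DPExampleWebId")
print(url)

# An actual request (credentials are site-specific):
#   import requests
#   items = requests.get(url, auth=auth).json()["Items"]
```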
Spotfire Data Connectivity
Spotfire connects to PI historians via ODBC/OLEDB or through Spotfire's Information Services layer, which can be configured to query PI AF and PI Archive directly. Spotfire also supports in-database analytics, meaning it can push computations to the database rather than pulling all data into memory -- important for large sensor datasets.
Refresh Strategies by Use Case
| Use Case | Recommended Approach | Refresh Frequency |
|---|---|---|
| Daily morning report | Power BI Import | Twice daily (5 AM and 8 AM) |
| Production surveillance | Power BI DirectQuery or Grafana | Real-time to 5 min |
| Engineering analysis | Spotfire connected to data lake | On-demand |
| Executive dashboard | Power BI Import | Daily |
| Well alarm monitoring | Grafana or dedicated SCADA UI | Sub-minute |
| Decline curve analysis | Spotfire or custom Python | On-demand |
The AI Layer on Top of Dashboards
Dashboards display data. The next evolution is dashboards that interpret data. The AI layer on top of BI tools is moving from experimental to practical, and it changes the calculus on tool selection.
Anomaly Detection
The most immediately valuable AI capability for production dashboards is anomaly detection: automatically flagging wells whose behavior deviates from expected patterns. This is exception-based monitoring on autopilot.
Power BI offers built-in anomaly detection in line charts (since 2021). It is basic -- it uses a time-series decomposition model -- but it works for simple cases like detecting unexpected production drops. For more sophisticated anomaly detection (multivariate, accounting for well interactions, seasonal patterns), you need external ML models feeding flags into the BI layer.
Spotfire's TERR engine can run R or Python anomaly detection scripts directly within dashboards, offering more flexibility but requiring more skill to implement.
The practical architecture for most operators:
[Production data] → [ML model (Azure ML, Databricks, or custom Python)]
↓
[Anomaly flags written to database]
↓
[Power BI / Spotfire displays flags alongside production data]
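The ML-model box in that diagram does not need to start sophisticated. A trailing-window z-score is a reasonable first pass at the flag-writing step; a sketch, with the 14-sample window and 3-sigma cutoff chosen for illustration.

```python
import pandas as pd

def anomaly_flags(series, window=14, z_cut=3.0):
    """Flag points deviating more than z_cut standard deviations
    from the trailing-window mean -- a minimal stand-in for the
    ML-model box in the diagram above."""
    roll = series.rolling(window, min_periods=window)
    mean = roll.mean().shift(1)  # shift: today's value is excluded from its own baseline
    std = roll.std().shift(1)
    z = (series - mean) / std
    return z.abs() > z_cut

rates = pd.Series([100, 102, 99, 101, 100, 98, 103, 101,
                   100, 99, 102, 100, 101, 99, 100, 60])  # last point: sharp drop
flags = anomaly_flags(rates)
print(flags.iloc[-1])  # True -- the drop is flagged
```

Writing the resulting booleans to a database table gives the BI layer something it is actually good at: coloring and filtering on a pre-computed flag.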
Automated Commentary
The emerging capability that will change daily workflows is AI-generated natural language commentary: dashboards that not only show you the data but tell you what changed and why it might matter.
"Well 14-7H production dropped 23% overnight. Gas-oil ratio increased 40% over the same period. Pattern is consistent with gas breakthrough or artificial lift failure. Three nearby wells show normal production, suggesting this is a wellbore issue rather than a reservoir issue."
This capability is being built into enterprise AI platforms (Cognite Atlas AI, SLB Tela) and can be implemented with LLMs connected to production databases. Power BI's Copilot (Microsoft 365 Copilot integration) offers a simplified version, though its E&P-specific reasoning is limited.
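Before reaching for an LLM, much of this commentary can come from a plain rules engine over computed deltas. A hedged sketch that reproduces the style of the quoted example; the field names and cutoffs are illustrative, not from any vendor product.

```python
def daily_commentary(well, today, yesterday):
    """Rule-based stand-in for LLM commentary: turn computed deltas
    into a readable sentence. Field names and cutoffs are illustrative."""
    oil_chg = (today["oil_rate"] - yesterday["oil_rate"]) / yesterday["oil_rate"]
    gor_chg = (today["gor"] - yesterday["gor"]) / yesterday["gor"]
    if oil_chg >= -0.15:
        return ""  # nothing noteworthy to report
    msg = f"{well} production dropped {abs(oil_chg):.0%} overnight."
    if gor_chg > 0.25:
        msg += (f" Gas-oil ratio increased {gor_chg:.0%} over the same period;"
                " pattern is consistent with gas breakthrough or artificial lift failure.")
    return msg

print(daily_commentary("Well 14-7H",
                       {"oil_rate": 385, "gor": 2.8},
                       {"oil_rate": 500, "gor": 2.0}))
```

An LLM adds value on top of this by weaving in context (offset wells, recent workovers), but the deltas themselves should always be computed deterministically, never left for the model to estimate.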
Predictive Alerts
Beyond detecting what has already happened, predictive models can warn of what is about to happen: wells trending toward failure, artificial lift systems approaching operating limits, or injection patterns suggesting impending frac hits. These predictions surface as alerts in dashboards or, increasingly, as notifications pushed to mobile devices and field teams.
The most effective implementations we have seen combine simple statistical models (moving averages, control charts) with domain-specific rules engines. Full machine learning models add value for complex patterns but require more data and expertise to maintain.
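The "simple statistical models plus rules" combination described above can be as small as this: Shewhart-style control limits plus a monotone-trend rule. A sketch; the window sizes and three-sigma band are illustrative defaults, not a recommendation for any specific lift system.

```python
import statistics

def control_limits(history, k=3.0):
    """Shewhart-style limits from a stable history window: mean +/- k sigma."""
    mu = statistics.mean(history)
    sigma = statistics.stdev(history)
    return mu - k * sigma, mu + k * sigma

def trending_toward_limit(history, recent, n=5):
    """Rules-engine check: the last n steps all move one direction
    AND the latest value sits outside the control band."""
    lo, hi = control_limits(history)
    diffs = [recent[i + 1] - recent[i] for i in range(len(recent) - 1)]
    monotone = all(d < 0 for d in diffs[-n:]) or all(d > 0 for d in diffs[-n:])
    return monotone and not (lo <= recent[-1] <= hi)

history = [100, 101, 99, 100, 102, 98, 100, 101, 99, 100]
recent = [99, 96, 92, 87, 81, 74]  # steady decline past the lower limit
print(trending_toward_limit(history, recent))  # True
```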
By Company Size: What Makes Sense Where
Supermajors (ExxonMobil, Chevron, Shell, BP, TotalEnergies)
Typical BI stack: Power BI + Spotfire + custom web dashboards. Budget is not the constraint -- capability and integration are.
Supermajors run both Power BI and Spotfire without hesitation. Power BI handles enterprise reporting, executive dashboards, and cross-functional analytics. Spotfire handles petroleum engineering analysis, reservoir characterization, and geoscience workflows. Custom web dashboards (often built on Python or proprietary platforms) handle specialized operational use cases.
Annual digital/IT spend: $100M to $1B+. The BI tool license cost is a rounding error.
Large Independents (Devon, EOG, Diamondback, ConocoPhillips)
Typical BI stack: Power BI + Spotfire.
Same dual-tool approach as supermajors, but with smaller teams. These companies typically have 5-15 dedicated data analysts or data engineers who build and maintain dashboards for hundreds of users. Spotfire is used by the engineering teams. Power BI is used for everything else.
Annual digital/IT spend: $20-100M.
Mid-Size Operators (1,000-10,000 wells)
Typical BI stack: Power BI, possibly with some Spotfire licenses.
This is where the decision gets real. Mid-size operators -- companies like Permian Resources, Matador, Crescent Energy, Ring Energy -- have IT budgets of $5-20M annually. They cannot casually deploy both tools at scale.
The pattern we see most often: Power BI as the primary platform, with 5-10 Spotfire Analyst licenses for the most technical users (usually in reservoir engineering or production analysis). The Spotfire users build analyses that are then simplified and republished in Power BI for broader consumption.
Permian Resources is the reference case: they run Databricks as their data platform, with Dagster for orchestration, dbt for transformations, and both Spotfire and Power BI for visualization.
Small Operators (Under 1,000 wells)
Typical BI stack: Power BI or Excel. Possibly Grafana for monitoring.
Small operators with IT budgets of $100K-$2M cannot justify Spotfire's cost. Power BI at $10/user/month is the right tool. For companies with technical founders or data-savvy engineers, Grafana connected to eLynx or zdSCADA data provides real-time monitoring at no license cost.
The realistic stack for a small operator:
- eLynx for SCADA ($10/asset/month)
- GreaseBook or OGsys for production tracking
- Power BI Pro for dashboards ($10/user/month)
- Excel for everything else (it is not going away)
The Decision Framework: When to Use Which
After years of working with E&P operators across the size spectrum, here is the honest recommendation:
Choose Power BI When:
- You need to distribute dashboards broadly across the organization.
- Your primary use case is production morning reports and executive reporting.
- You are on Microsoft Azure and the 365 ecosystem (you probably are).
- Budget is a constraint and you need one tool to cover the most ground.
- Your users are business analysts, operations managers, and executives rather than engineers doing deep analysis.
- You need mobile access for field personnel.
Choose Spotfire When:
- You have petroleum engineers and geoscientists who need to do ad-hoc data exploration.
- Statistical analysis (regression, clustering, decline curves) is a core workflow.
- You need O&G-specific visualization capabilities (well maps, cross-plots, engineering charts).
- You can afford the license cost and the learning curve investment.
- You already have Spotfire deployed and engineers depend on it (migration cost is high).
Choose Grafana When:
- You need true real-time operational monitoring (sub-minute refresh).
- You are monitoring infrastructure (SCADA, field equipment, network) rather than doing business analytics.
- You have DevOps or data engineering staff comfortable with open-source tools.
- You want to supplement Power BI or Spotfire with a real-time layer at zero license cost.
Choose Tableau When:
- You have a corporate mandate from outside the E&P business unit.
- Your use case is genuinely business analytics (finance, HR, corporate reporting) rather than engineering.
- Your company has an existing Salesforce relationship that includes Tableau licenses.
- You are in midstream or marketing rather than upstream operations.
The Recommended Stack for Most Mid-Size Operators:
- Power BI (primary) → Morning reports, KPI dashboards, executive reporting, company-wide distribution
- Spotfire (5-10 licenses) → Engineering analysis, reservoir work, decline curves, ad-hoc exploration
- Grafana (free) → Real-time SCADA monitoring, operational alerts, well alarm dashboards
- Excel (inevitable) → Economics, quick analysis, everything else
This four-tool combination costs roughly $15,000-$40,000 per year in BI licensing and covers every analytics use case an upstream operator encounters. It is not the simplest answer. It is the honest one.
What This Means for Your Data Architecture
The BI tool decision does not happen in isolation. It is downstream of every other architectural choice: your cloud provider, your data platform, your historian, your SCADA system. If you are building a modern data stack -- moving from on-prem SQL Server and manual morning reports to cloud-based analytics and automated insights -- the BI layer is the last piece, not the first.
The common mistake is starting the data modernization journey by buying a BI tool. The correct sequence is:
1. Fix the data plumbing (get SCADA and historian data into a queryable, reliable data store).
2. Build the data model (clean, transform, and organize data so it is BI-ready).
3. Choose the BI tool (now you know what you need because you know what data you have).
For a deeper look at how digital platforms and AI fit into the overall upstream technology stack, see our guide to digital platforms and AI in upstream oil and gas. For the specific design principles that make production dashboards succeed or fail, see our production dashboard design guide.
The BI tool is the part of the data stack that users actually see. Get it right, and your engineers will use it every morning. Get it wrong, and you have another expensive dashboard that nobody opens.
Need help choosing or implementing BI tools for your E&P operations? Get in touch.