
Interview with Abhay Paroha, Senior Software Engineer and Software Team Leader


Emmanuel Sullivan: Tell us a little bit about your current (or latest) position and what you do, as well as how you found your way into it.

Abhay Paroha: I am currently a Senior Software Engineer and Software Team Leader for one of the world’s largest oilfield services companies. In this role, I apply my expertise in computer science, software engineering, project management, and risk mitigation to lead new product development for next-generation production assurance software and solutions. I have been responsible for multiple patented new technologies that improve efficiency in production optimization using cloud computing and machine learning. I am also responsible for using agile methodologies to oversee complex, multimillion-dollar oil and gas (O&G) engineering and technology projects impacting customers worldwide. My position leverages my skills in software engineering, e.g., polyglot programming and system design, in developing many software features, as well as my business and leadership skills, which are necessary to lead teams and keep people and projects on schedule.

I earned my bachelor’s degree in computer science and engineering in 2007 from Jabalpur Engineering College, the oldest technical institute in central India, and my master’s degree in software systems in 2013 from Birla Institute of Technology & Science, Pilani, India. My master’s dissertation was on a project related to drilling data aggregation, the Real Time Data Service (RTDS), a vendor-neutral data aggregation and delivery service that provides operators with all the data generated at the rig in a single package, without the associated complexities. RTDS provides the technology and services across the value chain, from the rig to the interpretation applications, maximizing the value of that data. These real-time drilling operations services enable a new generation of petro-technical workflows, including sophisticated models that can receive and integrate real-time data streams. A drilling optimization workflow can then serve as the foundation for a sophisticated geo-mechanics model that receives real-time data. Using this model, down-hole risks identified in the planning stage can be managed, and since the model is continuously updated in real time, risks can be mitigated as operations progress.

My early engineering endeavors commenced at a multinational located in Noida, India, where I served as a software engineer for three years. In this role, I was responsible for building web applications in the banking and finance domain: I built a credit card scoring application for the largest credit card reporting company in the UK and a mortgage application for credit unions in the USA. My O&G career journey started in 2010 when I joined the production enhancement software team of the world’s second-largest oilfield services company at that time. In this role, I was responsible for the software development of a product for complex fracture modeling using real-time geo-seismic data, and I implemented scenarios for simulating real-time communication of micro-seismic event data from the oil field to the office.

With this strong technical background, I joined my current employer as a mid-level software engineer in 2012, and I have built my nearly 12-year career there working in the drilling and production domains. In the drilling domain, I was involved in developing a real-time data aggregator desktop application to ingest and aggregate real-time drilling data and to enable drilling workflows that reduce risk and optimize well construction. It collects multiple data types at rig locations and is easily extended to support several drilling data formats. I also developed a WITSML (Wellsite Information Transfer Standard Markup Language) server to store, manage, and provide access to well-site data in real time. It enabled interoperability between drilling rigs, downhole tools, surface equipment, data analysis software, and other components of the drilling and production process.

In the production domain, I worked on a Production Data Management System (PDMS). This scalable and configurable production management solution enables customers to more effectively and safely manage production operations and deliver maximum value for all types of assets: onshore, offshore, conventional, and unconventional. Working with a team of production operations engineers, I developed a web application to manage the asset operations of one of the major oil companies in Southeast Asia. Later, I played a key role in starting the digital transformation journey of the production assurance domain in 2016. I got opportunities to solve complex challenges in building a cloud-hosted production optimization platform. A few of the major challenges I solved included ingesting and serving operational production data at scale, building a distributed calculation engine to provide recommendations for optimizing oil production, and building multi-tenant Software as a Service (SaaS) applications, e.g., a machine learning-based well intervention candidate ranking system and well health monitoring solutions.

ES: What inspired you to start a career in the oil and gas industry?

AP: It has been 14 years since I joined the O&G industry. Many factors attract individuals to this industry, but a few key ones inspired me to pursue my software engineering career in O&G. First and foremost are the technological challenges, such as optimizing extraction processes, managing vast amounts of data, reservoir modeling, asset management, production optimization, and developing innovative solutions for exploration and production. The industry is continually evolving, embracing new technologies such as cloud computing, artificial intelligence, data analytics, and automation. As a software engineer, I wanted to implement innovative solutions that have a global impact and solve real engineering challenges.

Another factor is global reach and interdisciplinary collaboration. The O&G industry operates on a global scale, providing opportunities for engineers to work on projects around the world and collaborate with diverse teams from different cultures and backgrounds. I often work closely with geoscientists, petroleum engineers, data analysts, and other professionals. This interdisciplinary collaboration fosters a dynamic work environment where individuals can learn from experts in various fields and contribute their expertise to integrated solutions.

ES: How are some of the technology innovations you’ve developed improving the effectiveness of production operations?

AP: Many operating companies are using digital oilfield applications to optimize oil and gas production in real time. However, these applications often use traditional on-premises systems, which face challenges in maintenance, accessibility, and scalability. Additionally, the sector still faces challenges in data integration from diverse sources, the liberation of consolidated data for consumption, and cross-domain workflow orchestration. Digital transformation strategies have led to the rise of cloud-based solutions in the oil and gas industry due to their accessibility, customization, system stability, and scalability, which allow them to handle larger data volumes efficiently.

To address the challenges mentioned earlier, I delivered multiple patented technologies that resulted in a commercial data foundation solution for serving operational production data at scale. This cloud-hosted data foundation provides the underlying infrastructure, services, and interfaces required to support and unify production data ingestion and workflow orchestration. Through the alignment of common domain and digital concepts, it improves collaboration between people in distinct roles, such as production engineers, reservoir engineers, drilling engineers, deployment engineers, software developers, data scientists, architects, and subject matter experts (SMEs) working with production operations products and solutions.

Production operations are complex, involving various business roles, geographies, and workflows, and often utilizing a variety of software applications and tools. Data collection in these operations can become problematic if not managed properly. Challenges include the collection and availability of data from various sources, the frequency of data, the use of different conventions and standards across software, and the maintenance, accessibility, performance, and scalability of on-premises software applications. These challenges can significantly impair data value extraction, leading to data quality issues, inconsistencies in data-driven workflows, missing or incomplete data, and non-productive time when data is not utilized.

Production operations rely on cost reduction to remain competitive and enhance production. To achieve these goals, existing infrastructure should be leveraged rather than making new investments. Technological advancements and industry efficiency demand smarter solutions that cover both infrastructure and digital capabilities. A data-centric solution, like a cloud-hosted data foundation, helps manage diverse data sources for faster decision-making. This solution includes integrated analytics and machine learning, enabling greater access and consumption in a continuously enriched context. My patented work is divided into three areas: data ingestion, the production domain model, and scaling consumption workflows.

ES: How does your data ingestion approach manage the diverse array of data sources inherent in production operations to ensure a robust data-centric foundation?

AP: Reliable production operations advisory software relies on a robust data-centric foundation, especially when dealing with operational production data. Common data sources in production operations include Production Data Management Solutions (PDMS), corporate historians, edge devices, manual data entries, spreadsheets and CSV files, and calculated data from simulations or physical models. Capturing and continuously integrating data from these sources is a challenge, as employees typically spend around 80% of their time looking for and unifying data sources.

I proposed the idea of autonomous agents to solve the data ingestion problem. An agent is a piece of software operating near a data source, used to push data into a cloud environment for use in production operations workflows. Agents integrate time-series and structural data from various sources into a single data storage system based on industry standards. The framework supports diverse data types and frequencies, enabling data contextualization with the production domain model. It enables automatic ingestion of historical data or of incremental changes, increasing user confidence and streamlining data consumption in the way domain workflows require it.

Another significant concern is data security as data flows from a source system to cloud-based storage. The agent framework secures data during this flow, either through on-premises deployment of data adaptors or through a cloud-based connector, by authorizing the connection through cloud service accounts and ensuring data pipeline encryption. The agent supports either push- or pull-based data ingestion mechanisms, increasing data flow efficiency and reducing latency, and allowing applications or workflows to generate results on the fly with the latest available data.
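To make the agent idea concrete, here is a minimal, hypothetical sketch of a pull-then-push ingestion loop in Python. The endpoint URL, service-account token, and the `source.query` adaptor interface are illustrative assumptions, not the actual agent framework described in the interview.

```python
import time
import requests  # generic HTTP client; any client library would do

CLOUD_INGEST_URL = "https://example-cloud/api/v1/timeseries/ingest"  # hypothetical endpoint
SERVICE_ACCOUNT_TOKEN = "token-issued-to-this-agent"                  # hypothetical credential


def read_new_samples(source, since):
    """Pull new samples from a local source (historian, PDMS, CSV drop folder, ...).

    Placeholder: a real adaptor would speak the source system's own protocol.
    Each sample is assumed to be a dict with at least a "timestamp" key.
    """
    return source.query(start=since)


def push_batch(samples):
    """Push a batch of samples to cloud storage over an authorized, encrypted connection."""
    response = requests.post(
        CLOUD_INGEST_URL,
        json={"samples": samples},
        headers={"Authorization": f"Bearer {SERVICE_ACCOUNT_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()


def run_agent(source, poll_interval_s=60):
    """Incremental ingestion loop running near the data source."""
    last_seen = None
    while True:
        samples = read_new_samples(source, since=last_seen)
        if samples:
            push_batch(samples)
            last_seen = max(s["timestamp"] for s in samples)
        time.sleep(poll_interval_s)
```

In a push-based variant, the source system would call the agent when new data arrives instead of the agent polling on a timer; the authorization and encryption concerns are the same.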

ES: How do you model cloud-hosted operational production data using industry standards?

AP: An efficient operational production data ingestion architecture must be paired with a uniform schema that is comprehensible to all applications, workflows, solutions, and users. The operational production data domain model serves as a universal language, translating the specifics of source systems into industry-adopted domain concepts.

I invented a bi-temporal data domain model to store high-frequency time series and structural data gathered from disparate data sources. The domain model is a canonical model that represents a collection of entities, related properties, and relationships that can be linked together following different rules and validations. It supports fields, assets, surface and subsurface equipment, wells, boreholes, and combinations of these, among other types of entities widely used in the industry. The model allows the unification of data from different data sources and represents both raw and calculated data. It leverages decades of proven understanding of domain data models used in various applications related to the production domain and combines them into concepts supporting different verticals of the production domain, from upstream to midstream. The data model has three major sections: entities, which represent objects with physical or virtual boundaries; their associated properties; and relationships, which represent hierarchical links between entities based on how a consumer wants to use them. For example, one user may link entities according to their organization structure, while another user is more interested in technical workflows and equipment relationships.
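As a simplified illustration of that entity/property/relationship split, the sketch below shows how two consumers could build different hierarchies over the same entities. The names and fields are hypothetical; the real canonical model is far richer.

```python
from dataclasses import dataclass, field


@dataclass
class Entity:
    """An object with physical or virtual boundaries, e.g., a field, well, or pump."""
    entity_id: str
    entity_type: str                       # "field", "well", "borehole", "equipment", ...
    properties: dict = field(default_factory=dict)


@dataclass
class Relationship:
    """A directed, named link between two entities, e.g., 'contains' or 'equipped_with'."""
    parent_id: str
    child_id: str
    kind: str


# The same entities can be linked differently depending on the consumer's workflow:
field_a = Entity("F-01", "field", {"region": "Southeast Asia"})
well_1 = Entity("W-001", "well", {"status": "producing"})
pump_1 = Entity("P-100", "equipment", {"type": "ESP"})

org_view = [Relationship("F-01", "W-001", "contains")]           # organizational hierarchy
tech_view = [Relationship("W-001", "P-100", "equipped_with")]    # technical/equipment view
```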

Data consistency and performance are two major problems with older methods. The legal and audit implications of data storage are also just as significant as any other aspect of controlling data within a system. To address this, the data storage has two different time dimensions: the application time, which denotes the actual physical time of measurement, and the storage time, which denotes the moment at which a data point is saved in storage. Bi-temporality is the term used to describe this idea. Because of the immutability property, the bi-temporality of structural and time-series data contributes to consistency, repeatability, and the ability to carry out long-running computations.
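The following is a minimal sketch of the bi-temporal idea under the assumptions above: every value carries both a measurement time and a recording time, values are never overwritten, and a query can therefore be replayed "as the data was known" at any earlier point. The record and function names are illustrative.

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass(frozen=True)  # immutable: corrections are appended, never overwrite
class BiTemporalPoint:
    entity_id: str
    property_name: str
    value: float
    application_time: datetime   # when the measurement physically happened
    storage_time: datetime       # when the point was written to storage


def as_known_at(points, entity_id, prop, as_of_storage_time):
    """Return the series for (entity, property) as it was known at a past storage time.

    Later corrections (newer storage_time) are ignored, which keeps long-running
    calculations repeatable and auditable.
    """
    visible = [
        p for p in points
        if p.entity_id == entity_id
        and p.property_name == prop
        and p.storage_time <= as_of_storage_time
    ]
    latest = {}
    for p in sorted(visible, key=lambda p: p.storage_time):
        latest[p.application_time] = p  # newest visible revision wins per measurement time
    return sorted(latest.values(), key=lambda p: p.application_time)
```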

ES: What are ways to consume cloud-hosted operational production data to enable production operations workflows? 

AP: The data flow journey involves more than just ingestion and storage. It requires conditioning, transformation, and preparation for workflows using reliable and performant consumption services. The way use cases and solutions are expressed and delivered has changed in the digital transformation era. Efficiency and quality are crucial factors for the successful delivery of requirements in this data flow journey.

The consumption and workflow services in the production operations workflow enable users to identify and retrieve data efficiently. These services include searching for entity properties, traversing relationships, calculating, aggregating, and writing back time-series data, applying data quality attributes, and consuming data. I applied a microservices architecture to the calculation workflow services to ensure scalability and efficient distribution of these services across the system, making the process fast, reliable, and scalable. Any missing part can hinder the entire solution, making these features crucial for efficient production operations workflows. I also proposed a patented change journal algorithm to detect changes in time-series data collected from different data sources; it helped scale the calculation workflow services to as many as 20,000 oil wells. The algorithm receives data from a source, detects changes in the data, generates an aggregate change journal based on those changes, and provides access to the information in the aggregate change journal to a computational framework that consumes the data in a time-dependent manner.
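A hedged sketch of the change-journal idea follows; it is not the patented algorithm itself, just an illustration of the principle under stated assumptions: incoming time-series batches are compared against what is already stored, only the differences are appended to a journal, and calculation services consume the journal rather than re-scanning full well histories.

```python
from collections import defaultdict


class ChangeJournal:
    """Records which (well, time range) spans changed, so calculation services
    only recompute the affected ranges instead of rescanning every well."""

    def __init__(self):
        self.stored = defaultdict(dict)   # well_id -> {timestamp: value}
        self.journal = []                 # append-only list of aggregate change records

    def ingest(self, well_id, samples):
        """samples: iterable of (timestamp, value) pairs from any source."""
        changes = []
        for ts, value in samples:
            if self.stored[well_id].get(ts) != value:   # new or revised point
                self.stored[well_id][ts] = value
                changes.append((ts, value))
        if changes:
            self.journal.append({
                "well_id": well_id,
                "from": min(ts for ts, _ in changes),
                "to": max(ts for ts, _ in changes),
                "count": len(changes),
            })

    def pending_changes(self):
        """Consumed by the calculation framework to schedule targeted recalculations."""
        entries, self.journal = self.journal, []
        return entries
```

Because each journal entry names only the well and time range that changed, a downstream calculation service can rerun only the affected window, which is what makes scaling to tens of thousands of wells tractable.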

ES: What kind of production operations workflows are enabled by these innovations?

AP: These inventions enable several cloud-hosted production operations workflows, e.g., a well performance application providing a consolidated production overview; well surveillance for producers, injectors, and disposal wells; well intervention management; production forecasting; model management; and analytics for production optimization and process facility operations. The well performance application focuses on the key value activities required to operate an asset through management by exception. Well portfolio optimization is a comprehensive opportunity management system for maturing production enhancement opportunities from candidate recognition to job execution. The inclusion of past intervention best practices and lessons learned in this system predicts the chance of success for every intervention job. Engineers are now able to spend more time maturing production enhancement opportunities instead of looking for them in the first place.

Other applications, such as production forecasting, provide engineers with a continuous understanding of targets and planning through an automated, evergreen pipeline of forecasts. Model management ensures a seamless, automated way of calibrating and updating models for use in production-related workflows. The process facility operations application focuses on surface equipment surveillance related to oil, water, and gas fluid handling; it also helps engineers perform equipment prognostics and health monitoring (PHM) analytics. Through the cloud-hosted operational production data foundation, all these applications help operators turn their production data into valuable insights in the cloud, allowing them to make timely decisions in production optimization and maximizing the true value chain of production operations.

Agents can import large volumes of data quickly, which makes it possible to build and train machine learning models and save the results back into the data foundation for later use. Additionally, utilizing an agile methodology and a microservices architecture expedites the delivery of new features and capabilities and aids in the solution’s flexible scaling. The goal of liberating and organizing data not only addresses current business problems but also facilitates the exploration of previously untapped markets.

ES: What main technical skills do you think will be needed to advance in the O&G industry as a Software Engineer? 

AP: Software engineers in the oil and gas industry play a crucial role in driving innovation, improving operational efficiency, and enabling data-driven decision-making across various aspects of the industry. Software engineers can develop applications for geoscience and reservoir engineering, drilling and well engineering, production and operations, data analytics and machine learning, health, safety, and environment (HSE), and asset management. These applications help analyze seismic data, model reservoirs, optimize production, monitor drilling operations, analyze well data, and optimize processes. Engineers can also create applications for monitoring and controlling production operations, automating processes, and enhancing efficiency, and they can develop algorithms and models for analyzing large datasets, predicting equipment failures, optimizing production processes, and identifying cost-reduction opportunities. Additionally, they can develop asset management systems to track and maintain infrastructure, optimize performance, and extend asset life. All of these applications use basic to advanced concepts of computer science and software engineering.

Young software professionals should focus on mastering technical knowledge and expertise, staying updated with the latest technologies, and adapting to industry trends. The energy industry has embraced digitalization to improve efficiency and profitability, requiring engineers to broaden their skill sets. This can be achieved by attending training sessions and conferences and by pursuing certifications in specialized areas like cloud computing, machine learning, the Internet of Things (IoT), and artificial intelligence. Continuous learning is crucial for a successful career in the ever-evolving energy industry, and opportunities for professional development are abundant.

Headline image courtesy of Abhay Paroha.

Author Profile
Emmanuel Sullivan
CEO/Publisher

The CEO of U.S. Energy Media, Emmanuel Sullivan is a technical writer who has built up his profile in the oil and gas industry. He lives and works in Houston, where he publishes Oilman and Oilwoman on a bimonthly basis, and Energies quarterly, distributing the magazine to energy thought leaders and professionals throughout the United States and around the world. At a time when technology is rapidly changing, he provides an invaluable service to oil & gas, and renewable energy executives, engineers, and managers, offering them both broad and specific looks at the topics that affect their livelihoods. Sullivan earned his BA in Communications at Thomas Edison State University and his MA in Professional Writing at Chatham University. 

