QUESTIONS REMAIN AS INDUSTRY SEEKS INTEGRATED, STANDARDIZED WORKPLACE

March 14, 1994

J.W. King
Simon Petroleum Technology
Marlow, U.K.

The challenge for exploration and production technology in the 1990s is to support business activities of an industry that must function in an environment of continual change.

Since the oil price crisis of the 1980s, oil and gas companies large and small have seen the need to reevaluate work done to explore for, develop, and manage their hydrocarbon assets.

More important than this are the changes in working practice adopted by many. Approaches to getting the job done have changed fundamentally.

These changes shape the technological challenge by creating two important pressures. One is growth in the volume of data to be collected, analyzed, and maintained. The other is reduction in the number of people who perform the work.

Oil companies react to the pressures of shrinking staffs in three general ways:

  • Don't do the work at all. Many oil and gas fields have historically been analyzed largely for the sake of analysis. Full field models are monitored and extensive simulation studies conducted for little net benefit. There are moves to abandon studies of this form and to avoid expensive and time-consuming equity studies.

  • Share data. The idea here is to provide both raw and interpreted data to other companies working within the same group on a particular asset. Companies may then accept the operator's model with little or no reinterpretation. Several initiatives to establish industry data centers currently exist.

  • Revert to core business, that is, manage only those aspects that are directly part of the business. The management of information services functions, software development, and other routine services is bought from external service providers. One approach involves straightforward subcontracting of an activity. The other involves the creation of a multi-oil company venture in which the contractor may be able to provide further cost savings through economies of scale. Within both approaches, the oil companies leverage the market to create keen competition.

In each instance, managers are demanding that their staff thoroughly assess the risks of any particular development plan by considering a range of sensitivity cases.

Technology is beginning to be able to respond to these pressures. Sweeping changes to hardware procurement were introduced in the middle to late 1980s through the viability of client/server computing. The introduction of networks of personal computers (PCs), Macintoshes, and Unix workstations has brought dedicated processing power within the grasp of individual users. The evolution toward larger memory chips, faster and larger disks, and much more powerful central processing units (CPUs) at ever-lower cost has opened the way for very rapid expansion of the range of sophisticated software applications.

Standardization within the computer industry as a whole has provided a raft of tools for the development of applications with a common look and feel, through X-Windows and Motif. User expectations have increased in parallel with this.

No longer is the geophysicist or the simulation engineer the only user needing access to powerful software technology. All users now covet access to productivity tools, which may enhance their performance by anything up to fivefold.

The question is how to harness the technology to provide the tools while attending to the need to protect ongoing capital investment.

DEVELOPING SOLUTIONS

There are three threads to solutions developing in the industry:

  • Integration.

    Improved data management and usage are the way forward for technological advances. These, if implemented properly, improve the interpretational environments in which users operate, in turn increasing efficiency and enhancing the quality of work.

    The approach, explored since the late 1980s, combines the provision of broad-based project data bases with purpose-built software applications (Fig. 1). Several such systems have appeared in one form or another, with different emphases on the data base aspects, the application aspects, or the integration aspects themselves.

    The scope of the data base will vary. To be effective in today's operating environment, especially where complete production seismology is used, the project data base must accommodate a range of data, from multiple (overlapping) seismic survey data at one extreme to reservoir modeling and simulation data at the other.

    Applications must similarly be very broad in scope. Because of the change in working practice, it does not necessarily follow that each application must be excellent in its own right. There is a growing recognition of the 80:20 rule, which states that 80% of the users employ only 20% of the functionality.

    The project data base may be field-oriented in a production department. It is much more likely to be a basin or regional scale data base in an exploration context. Whatever the geographical extent and however large the volume of data, effective project building and reporting require data transfer between the project data base and other data stores, whether corporate data bases, other project data bases, or specific application or proprietary data stores. Many vendors address this problem specifically.

  • Standards.

    The goal of a standard data model aimed at producing compatible data bases is now in sight. Work by the Public Petroleum Data Model Association (PPDM) and the Petrotechnical Open Software Corp. (POSC) is being consolidated to produce the first such data model. It will be some time before this is mature enough to bring into full operational use, but the progress is there for all to see.

    The establishment of such a standard is a key objective for the 1990s.

  • Interoperability.

    When there is a data model acknowledged by all to be both mature and stable, the way will be open for a deeper level of interaction between applications from different suppliers.

    Interaction is currently limited to data exchange via an exchange protocol, but full interoperability of applications is expected to become the goal. By this, users will expect applications from different vendors to share the same project data bases (Fig. 2), as sketched in the example following this list.
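
To make the idea of interoperability through a shared project data base concrete, the sketch below shows two notional applications from different vendors reading and writing one small project data base directly, with no export or import step between them. This is a minimal illustration only: the well_header table, its columns, and the vendor functions are invented for the example and are not drawn from the PPDM or POSC data models.

    import sqlite3

    # Hypothetical, simplified project data base schema: a single well-header
    # table. Table and column names are illustrative only; they are not taken
    # from the PPDM or POSC data models.
    SCHEMA = """
    CREATE TABLE IF NOT EXISTS well_header (
        well_id       TEXT PRIMARY KEY,
        well_name     TEXT NOT NULL,
        latitude      REAL,
        longitude     REAL,
        total_depth_m REAL
    );
    """

    def vendor_a_load_wells(db_path):
        # Notional interpretation package from "vendor A" writes well headers
        # straight into the shared project data base.
        with sqlite3.connect(db_path) as con:
            con.executescript(SCHEMA)
            con.executemany(
                "INSERT OR REPLACE INTO well_header VALUES (?, ?, ?, ?, ?)",
                [("W-001", "Alpha-1", 58.42, 1.75, 3120.0),
                 ("W-002", "Alpha-2", 58.45, 1.78, 2980.0)],
            )

    def vendor_b_report_wells(db_path):
        # Notional reporting package from "vendor B" reads the same tables
        # directly; no export/import via an exchange protocol is needed.
        with sqlite3.connect(db_path) as con:
            rows = con.execute(
                "SELECT well_id, well_name, total_depth_m "
                "FROM well_header ORDER BY well_id")
            for well_id, name, depth in rows:
                print("%s  %-10s  TD %.0f m" % (well_id, name, depth))

    if __name__ == "__main__":
        project_db = "project.db"          # the shared project data base
        vendor_a_load_wells(project_db)    # application from vendor A writes
        vendor_b_report_wells(project_db)  # application from vendor B reads

Both notional applications work only because they agree in advance on the table definition; in practice it is exactly this shared, unambiguous definition that a mature standard data model is intended to supply.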

KEY ISSUES

How integrated is integrated? How good is the standard? What will it all cost? Are there suppliers in a position to supply?

These are some of the issues that arise when considering these heralded solutions.

There is no doubt that some of the products currently available either realize or come very close to realizing the objectives of an integrated solution. The eventual choice of supplier may well rest on the emphasis within the user community requiring the solution.

A more significant issue is the acceptance that an 80% or 90% solution is sufficient for the vast majority of interpretational tasks. No user ever feels comfortable with anything other than "the best." However, those who try it usually find the benefits far outweigh any shortcomings.

In considering standards, two big issues come to light: viability and comprehensiveness.

Viability relates to maturity and stability. A standard must have reached a point at which further changes are not architectural but much more peripheral in nature. Users of the model must have confidence that work done is safeguarded from immediate obsolescence through major changes. The POSC standard has not reached this stage yet and may not do so until some way through 1995.

A standard's comprehensiveness encompasses both the content and the clarity with which concepts are described and defined. It is very important, for instance, that there be no gaps in the definitions; gaps promote extensions that, if carried out externally to the standards process, lead to incompatibilities.

Equally important is the need to ensure that there are no ambiguities, which leave open the prospect of different interpretations of the standard, leading to a breakdown in interoperability.

The POSC model is in the early stage of evaluation through a number of pilot projects. Already, some conclusions in this area are beginning to emerge. Initial findings indicate that there will be a continuing requirement for refinement of the first versions of the model for some time to come.

COST ISSUES

The costs of introduction of these new technologies can be categorized as those associated with migrating the applications, migrating the data, reequipping with new hardware, and extending staff skills (Fig. 3). The first two of these costs may be particularly high.

At present, there are essentially no application software solutions, integrated or otherwise, that operate within the POSC environment. Therefore, all applications required to be POSC-compliant will need to be migrated by their owners, with the cost of migration being recovered through future upgrade and license fees.

The cost is far from trivial. POSC familiarization, software reengineering, and testing will cost each vendor potentially millions of dollars. These are incremental costs because existing products must be maintained in parallel.

There will also be costs associated with the withdrawal of previous versions from the marketplace and their replacement by new compliant versions.

The strategy for data migration will vary from company to company. Some may decide that the cost of dearchiving data, converting them to new data base formats, and rearchiving is prohibitive. They will therefore decide not to undertake the task.

However, if archived data are to be useful and the full benefits of standardization enjoyed, then the cost will have to be borne at some point. The management of this process may run into tens of millions of dollars.

Finally, it is probable that more hardware for either server or client support will be required fully to implement the new technology. Few companies have sufficient hardware resources at this time.

Quite apart from the hardware costs, investment will be required to ensure that users fully exploit the advantages of new technologies. This will cover all aspects of the skills life cycle, from initial training through advanced user training.

SUPPLIERS' CHALLENGES

Suppliers are caught in a cleft stick. On one side there is acknowledgment that migration toward the heralded solutions is the way forward. On the other, it is far too soon to be sure that a market exists at a level that would justify the expenditure implied.

In addition, there is an implied change in market forces, with standardization and interoperability apparently destined to open the market and force down prices through leveraging.

At present, vendor profitability is generally marginal, and it is hard to see how the same level of service and research and development can be supplied for a cost that, industrywide, is reduced. Indeed, it is likely that as the market becomes more competitive the large companies may fare somewhat better in the medium term, thereby reducing choice and flexibility.

Outsourcing is a policy that has been expounded for several years. To date, few significant opportunities have become available in the software arena.

The buy-not-build option as an alternative to internal R&D has tended to result in preservation of the status quo in many companies. Vendors are undoubtedly ready to begin to supply. But is the opportunity really there today?

THE FUTURE

There is some way to go before the dream becomes reality. However, rapid progress is being made. By the turn of the century, there probably will have been several years of use of standards-compliant, integrated solutions.

What can be done to hasten the day? From the viewpoint of a supplier involved as both a contributor to the standards process and an early evaluator of the products of that process, the clearest conclusion is that partnership is required.

Partnership between vendors as suppliers and oil companies as consumers, aimed at bringing information from practical experience back into the standards process, is the only way to ensure that the whole process converges on a realistic and viable solution.

Oil company involvement and funding is fundamental, for by them the environment is established and sustained in which solutions can emerge, evolve, and be proved and validated. Oil companies can provide the test data and infrastructure necessary thoroughly to test any solution as well as the basic funding necessary to kick-start the process.

Partnerships in which a limited number of parties come together to address specific objectives provide the catalyst through which confidence will grow within the supply side of the industry that there really is a market. Through these partnerships, the whole process of application and data migration can be brought together and coordinated.

The final contribution the oil companies bring through partnerships is the sustaining of momentum as the process builds up through pilot and other exploratory projects.

The prospects for the future are good, the way forward being through strategic partnerships and alliances. There is every hope that new technologies will be in widespread use by the end of 1996. Let us hope this is the case for the good of the industry as a whole.

Copyright 1994 Oil & Gas Journal. All Rights Reserved.