
Design And Implementation Of Computerized Information Management System In Seismic Data Processing

(A Case Study of Integrated Data Services Limited (IDSL), Benin City)

5 Chapters | 54 Pages | 7,110 Words

A Computerized Information Management System (CIMS) in seismic data processing refers to a sophisticated framework designed to streamline the organization, storage, retrieval, and analysis of seismic data obtained from various subsurface layers of the Earth. This system plays a crucial role in the exploration and understanding of the Earth’s structure, employing advanced algorithms and computational techniques to process seismic signals. It integrates diverse components, such as data acquisition, storage infrastructure, and analytical tools, to efficiently handle the vast and complex datasets generated during seismic surveys. The seismic data Computerized Information Management System significantly enhances the efficiency of seismic interpretation by providing a centralized platform for data management, facilitating collaborative efforts among geoscientists and researchers. Moreover, it ensures data integrity, accessibility, and accuracy, thereby contributing to informed decision-making in the exploration and extraction of subsurface resources.

ABSTRACT

This project work focuses on the computerization of the information management system for seismic data processing, in order to improve the services and performance of the oil-producing industry.

The purpose of this study is to ensure that good seismic data are acquired in an effective manner; that all field tapes collected from the field are sent to the processing centre and the seismic data on those tapes are presented in a form that is convenient for geological interpretation; and that the processing sequence is followed so that the results are suitable for acceptance.

The problem of this study is how the information collected from the field can be presented and protected from destruction. Tape librarians were introduced to help manage the tapes, but as time goes on, the number of tapes acquired has become so large that the librarians cannot manage them effectively.

The solution to the problem lies in removing any factor that inhibits the ability of the librarians to make the storage and retrieval of the tapes easy within the organization.

 

TABLE OF CONTENTS

Title Page
Certification
Dedication
Acknowledgement
Abstract
Table of Contents

CHAPTER ONE
Introduction 1-2
1.1 Problem of the study 3
1.2 Purpose of the study 4
1.3 Scope of the study 4
1.4 Limitation 4

CHAPTER TWO
Literature Review 5-9

CHAPTER THREE
Description and Analysis of the Existing and New System Design 10
3.1 Fact finding method (oral) 10-11
3.2 Organization Structure 11-12
3.3 Objective of the existing system 13
3.4 Problems of the existing system 14
3.5 Justification 14-15
3.6 Input, Process, Output analysis 15-20
3.7 Design of New System 21
3.8 Output Specification and design 22
3.9 Input Specification and design 22-24
3.10 File design 24
3.11 Procedure chart 24-25
3.12 System Flowchart 26
3.13 System Requirement 27-28

CHAPTER FOUR
Implementation 29
4.1 Program design 30
4.2 Program Flowchart 30-33
4.3 Test run 34
4.4 Documentation 35-40

CHAPTER FIVE
5.1 Recommendation 41-42
5.2 Conclusion 43-44
Definition of Terms
References

CHAPTER ONE

INTRODUCTION
Exploration activities involve the acquisition and processing of data. As exploration activities increase in the modern world, computer manufacturers are also working hard to produce supercomputers with the high computing power needed to cope with the large volumes of data processing involved.

Owing to the indispensable role played by the oil industry in the modern world, the Federal Government recently issued a white paper requiring all oil companies to do more exploration in order to increase oil reserves. This means that there will be more data acquisition and processing jobs in the near future.

The importance of oil exploration to modern economies can never be over-emphasized. In the United States of America and Britain, intensive exploration activities have been going on for the last two decades, all in an attempt to secure sufficient oil reserves.
Seismic data processing is therefore indispensable in the oil industry, because it is a key step in the discovery of hydrocarbons; indeed, it is a central stage in the sequence of the search for hydrocarbons.
In the processing of seismic data, care should be taken, because data that are not properly processed may lead to wrong interpretation, and wrong interpretation may lead to false prospects. Wrong processing reduces the success ratio in the discovery of hydrocarbons and causes drilling costs to increase. In essence, wrong processing leads to the drilling of false prospects, and it costs millions, if not billions, of Naira to drill a well.

The role of the Government in the oil industry has gradually progressed from a regulatory one to direct involvement in exploitation and exploration. Initially, the Government only made rudimentary laws and collected royalties from the oil companies. By 1971 the situation had changed as oil became more important to the Nigerian economy; oil now accounts for about 90% of the total foreign exchange earnings of the country.

In essence, a Nigeria that wants to increase her revenue must engage in the acquisition, processing and management of seismic data. A seismic data management system is software developed for that purpose. That is why the case study is Integrated Data Services Limited (IDSL), Benin City, a subsidiary of the Nigerian National Petroleum Corporation (NNPC).

1.1 PROBLEMS OF THE STUDY
In the search for hydrocarbons, the initial question has always been where exactly beneath the surface the anticlines or other structures within which hydrocarbons can accumulate are located.

This question is answered in part by seismic data acquisition and in the main by seismic data processing. Acquisition gathers the raw data, while processing transforms and refines these data to a level where the above question is answered.

The problem is how the information collected from the field can be presented and protected from destruction. That is why tape librarians were introduced to help in managing these tapes; but as time goes on, the number of tapes has grown so large that the librarians cannot manage them effectively. This has created many problems for the management.

The solution to the problem lies in removing any factors that inhibit the ability of the librarians to make the storage and retrieval of tapes easy within the organization.
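
To make the idea of computerized storage and retrieval concrete, the sketch below keeps a small catalogue of field tapes in an SQLite table and looks a tape up by its number. The table layout used here (tape number, seismic line, shot point range, shelf location) is a hypothetical example for illustration only, not the file design adopted later in this work.

    import sqlite3

    # Hypothetical catalogue layout -- an illustrative assumption,
    # not the actual IDSL file design.
    conn = sqlite3.connect("tape_catalogue.db")
    conn.execute(
        """CREATE TABLE IF NOT EXISTS tapes (
               tape_no   TEXT PRIMARY KEY,
               line_name TEXT,       -- seismic line the tape belongs to
               shot_from INTEGER,    -- first shot point on the tape
               shot_to   INTEGER,    -- last shot point on the tape
               shelf     TEXT        -- physical storage location
           )"""
    )

    def register_tape(tape_no, line_name, shot_from, shot_to, shelf):
        """Record a field tape as it arrives from the crew."""
        conn.execute(
            "INSERT OR REPLACE INTO tapes VALUES (?, ?, ?, ?, ?)",
            (tape_no, line_name, shot_from, shot_to, shelf),
        )
        conn.commit()

    def find_tape(tape_no):
        """Return the line details and shelf location for one tape, or None."""
        return conn.execute(
            "SELECT line_name, shot_from, shot_to, shelf FROM tapes WHERE tape_no = ?",
            (tape_no,),
        ).fetchone()

    register_tape("T-0412", "LINE-07", 101, 240, "RACK-B3")
    print(find_tape("T-0412"))

However the records are finally structured, the point is that locating a tape no longer depends on a librarian's memory of where it was shelved.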

1.2 PURPOSE OF THE STUDY
This research will be very useful to students, individuals, corporate bodies and the general public as a whole, who are either engaged in, or may wish to engage in, seismic data acquisition and processing in the future.
The purposes are as follows:
1. Ensuring the acquisition of good quality seismic data in an effective manner.
2. Ensuring that all field tapes collected from the field are sent to the processing centre and that the seismic data on the tapes are presented in a form that is convenient for geological interpretation.
3. Ensuring that the processing sequence is followed and that the results are suitable for acceptance.
4. Ensuring that the processed seismic section, which is the final output of processing, is suitable for field appraisal and development.

1.3 SCOPE OF THE STUDY
In this research work, the study is limited to the use of Electronic Data Processing (EDP) for information management and control in seismic data processing.

1.4 LIMITATION
Due to the workload involved, sufficient research could not be extended to other known seismic data acquisition and processing organizations. It is hoped, however, that the information obtained here is applicable to other seismic data organizations.

 

 

MORE DESCRIPTION:

Computerized Information Management System In Seismic Data Processing:

A Computerized Information Management System (CIMS) plays a crucial role in seismic data processing, which is an essential part of the oil and gas exploration industry. Seismic data processing involves the collection, analysis, and interpretation of seismic data to create subsurface images of the Earth’s structure. Here’s how a CIMS can be used in seismic data processing:

  1. Data Acquisition and Storage:
    • Seismic data is collected using geophones or seismometers, which record ground motion caused by controlled explosions or vibrations. These sensors generate vast amounts of data.
    • A CIMS is used to store and manage this raw data efficiently. It includes data storage solutions that can handle large volumes of seismic data, such as network-attached storage (NAS) or cloud-based storage.
  2. Data Quality Control:
    • Before processing, seismic data needs quality control to remove noise and correct any irregularities.
    • A CIMS can automate the quality control process by implementing algorithms that identify and flag problematic data points (a minimal sketch of such a check follows after this list).
  3. Pre-processing:
    • Seismic data often undergoes pre-processing, which includes tasks like noise removal, trace editing, and time-to-depth conversion.
    • The CIMS can automate these pre-processing steps to save time and ensure consistency.
  4. Parallel Processing:
    • Seismic data processing is computationally intensive and can benefit from parallel processing on high-performance computing clusters.
    • A CIMS can manage the distribution of processing tasks across multiple computing nodes, optimizing resource utilization (see the second sketch after this list).
  5. Workflow Management:
    • Seismic data processing involves a series of interconnected tasks and workflows.
    • A CIMS can facilitate workflow management by tracking the progress of individual processing steps, handling dependencies, and providing a user-friendly interface for configuring and monitoring workflows.
  6. Data Integration:
    • In addition to seismic data, other geospatial and geological data, such as well logs and geological models, may be integrated into the analysis.
    • A CIMS can provide tools for integrating, visualizing, and analyzing various types of data to enhance the interpretation of seismic results.
  7. Data Archiving and Retrieval:
    • Processed seismic data, interpretation results, and associated metadata need to be archived for future reference.
    • A CIMS can manage data archiving and retrieval, ensuring that historical data remains accessible.
  8. Collaboration:
    • Seismic data processing often involves collaboration among geoscientists, engineers, and data analysts.
    • A CIMS can provide collaboration tools, such as shared workspaces and version control, to facilitate teamwork and data sharing.
  9. Reporting and Visualization:
    • A CIMS can generate reports and visualizations to communicate results effectively to stakeholders.
    • It may include tools for creating 2D and 3D seismic images, cross-sections, and other visualizations.
  10. Security and Compliance:
    • Seismic data is valuable and sensitive, so a CIMS must implement robust security measures to protect data integrity and confidentiality.
    • Compliance with industry regulations and standards is also essential, and the system should support audit trails and data traceability.
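
As a minimal sketch of the automated quality control described in item 2 above, the snippet below flags traces whose RMS amplitude deviates strongly from the rest of the gather. The threshold and the synthetic data are assumptions for illustration; a real system would apply the survey's own QC criteria.

    import numpy as np

    def flag_noisy_traces(gather, threshold=3.0):
        # gather: 2-D array, one row per trace (n_traces x n_samples).
        # Returns a boolean array, True where a trace looks problematic.
        # A simple illustrative criterion, not a production QC rule.
        rms = np.sqrt(np.mean(gather ** 2, axis=1))
        median = np.median(rms)
        spread = np.median(np.abs(rms - median)) + 1e-12   # robust scale estimate
        return np.abs(rms - median) > threshold * spread

    # Synthetic example: ten quiet traces plus one very noisy channel.
    rng = np.random.default_rng(0)
    gather = rng.normal(0.0, 1.0, size=(11, 500))
    gather[7] *= 20.0                                       # simulate a bad channel
    print(np.where(flag_noisy_traces(gather))[0])           # -> [7]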
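
Item 4's distribution of work across computing resources can likewise be sketched in miniature with Python's multiprocessing pool, with each worker process handling one shot gather. The moving-average smoothing used here is only a stand-in for a real processing step such as deconvolution or migration, and the gathers are synthetic.

    import numpy as np
    from multiprocessing import Pool

    def process_gather(gather):
        # Stand-in processing step: moving-average smoothing applied
        # trace by trace; a real job would run deconvolution, stacking,
        # migration, and so on.
        kernel = np.ones(5) / 5.0
        return np.array([np.convolve(trace, kernel, mode="same") for trace in gather])

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        # Eight synthetic shot gathers, 24 traces of 1,000 samples each.
        gathers = [rng.normal(size=(24, 1000)) for _ in range(8)]

        # One gather per worker process; on a cluster the same idea
        # scales out to many nodes through a job scheduler.
        with Pool(processes=4) as pool:
            results = pool.map(process_gather, gathers)

        print(len(results), results[0].shape)   # 8 (24, 1000)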

In summary, a Computerized Information Management System is essential in seismic data processing to efficiently handle the vast amounts of data involved, automate processing tasks, manage workflows, ensure data integrity, and facilitate collaboration among experts. It plays a critical role in improving the accuracy and efficiency of subsurface imaging and geological interpretation in the oil and gas industry.