

  • Upgrade Oracle Golden Gate 12cR1 to 12cR2

Upgrade Oracle Golden Gate 12cR1 to 12cR2

In this article, we will look at an Oracle Golden Gate upgrade from 12.1 to 12.2 on a Linux server. Make sure you allow the Golden Gate processes to finish processing all current DML and DDL data in the GG trail files.

Golden Gate Upgrade Prerequisite
Perform Golden Gate Upgrade

If you are upgrading the database along with GG, upgrade the database first.

Golden Gate Upgrade Prerequisite

Stop user activity on the source objects which are involved in replication.

Stop Extract
GGSCI> lag ext1
GGSCI> stop ext1

Make sure the replicat applies all transactions on the target
GGSCI> lag rep1

Stop pump and Replicat
GGSCI> stop dp1
GGSCI> stop rep1

Stop Manager on both source and target
GGSCI> stop mgr

Take a backup of GG_HOME on both source and target. If you want to upgrade the source and target DBs, this is the time. Even once the DBs are upgraded, make sure there is no activity on the source objects involved in the replication.

Perform Golden Gate Upgrade

Download the latest 12.2.0.2 Golden Gate software and copy the zip file to both source and target servers. Unzip it and start the runInstaller. Give the same old GG_HOME location during installation; do not worry, this option will auto-upgrade the old GG. Do not start the manager. Perform the installation on both source and target in the same old GG_HOME location.

Run the ulg.sql script for the supplemental log upgrade from GG_HOME

cd $GG_HOME
sqlplus / as sysdba
SQL> @ulg.sql  --> press enter when prompted

Rollover Extract, Pump, and Replicat to the next trail sequence number

On source DB:
=============
alter extract ext1 etrollover
alter extract dp1 etrollover
info ext1 detail
ALTER EXTRACT dp1, EXTSEQNO <new trail seqno>, EXTRBA 0
info dp1 detail

On target DB:
=============
alter replicat rep1, EXTSEQNO <new trail seqno>, EXTRBA 0

Start Manager, Extract, Pump, and Replicat. If you hit the known Oracle error at this point, the fix is the prvtlmpg.plb script that ships with the GG binaries and resides under $GG_HOME. Run this script on both the source and target databases

cd $GG_HOME
sqlplus / as sysdba
SQL> @prvtlmpg.plb  --> press enter when prompted

Allow activity on the source objects involved in the replication.
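As a quick sanity check after the upgrade (a minimal sketch, reusing the ext1/dp1/rep1 process names from the steps above), restart everything and confirm all processes report RUNNING:

On source:
==========
GGSCI> start mgr
GGSCI> start ext1
GGSCI> start dp1
GGSCI> info all    --> all processes should show RUNNING

On target:
=========
GGSCI> start mgr
GGSCI> start rep1
GGSCI> info all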

  • Check Oracle database size

Check Oracle database size

There are several ways to measure the size of an Oracle database. As an Oracle DBA you may face the requirement to get the current database size. Below are the queries which you can use to find the size of an Oracle database.

Check db size - large database
Exact database size
Analyzing query output
Check users & space used

Check db size - large database

For very big databases, where the size runs into multiple GB or TB, the below command gives you a bird's eye view of the database size, used space and free space. Please note: this query rounds off the output and hence does not show you the exact utilization.

col "Database Size" format a20
col "Free space" format a20
col "Used space" format a20
select round(sum(used.bytes) / 1024 / 1024 / 1024 ) || ' GB' "Database Size"
, round(sum(used.bytes) / 1024 / 1024 / 1024 ) - round(free.p / 1024 / 1024 / 1024) || ' GB' "Used space"
, round(free.p / 1024 / 1024 / 1024) || ' GB' "Free space"
from (select bytes from v$datafile
      union all
      select bytes from v$tempfile
      union all
      select bytes from v$log) used
, (select sum(bytes) as p from dba_free_space) free
group by free.p
/

Exact database size

The size of the database is the space its files physically consume on disk. You can find this with

select "Reserved_Space(GB)", "Reserved_Space(GB)" - "Free_Space(GB)" "Used_Space(GB)", "Free_Space(GB)"
from (
select
(select sum(bytes/(1024*1024*1024)) from dba_data_files) "Reserved_Space(GB)",
(select sum(bytes/(1024*1024*1024)) from dba_free_space) "Free_Space(GB)"
from dual
);

Analyzing query output

When you run the above query, you will see output like the below

Reserved_Space(GB) Used_Space(GB) Free_Space(GB)
------------------ -------------- --------------
        1.43491124     1.34488439     .090026855

We can see that 1.4 GB is the allocated space across all the data files in the database. Out of the 1.4 GB of allocated segments, 1.3 GB is used and 0.09 GB is free space.

Check users & space used

We can even check the amount of disk space used by each user (schema) inside the database using the below query

select owner, sum(bytes)/1024/1024 Size_MB
from dba_segments
group by owner;
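If you also want the usage broken down per tablespace rather than for the whole database, a query along the following lines works (a sketch using dba_data_files and dba_free_space; the column aliases are my own):

select d.tablespace_name,
       round(sum(d.bytes)/1024/1024/1024, 2) allocated_gb,
       round(nvl(f.free_bytes, 0)/1024/1024/1024, 2) free_gb
from dba_data_files d
left join (select tablespace_name, sum(bytes) free_bytes
           from dba_free_space
           group by tablespace_name) f
  on f.tablespace_name = d.tablespace_name
group by d.tablespace_name, f.free_bytes
order by 1;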

  • Golden Gate Replication When Table Structure Is Different – COLMAP

Golden Gate Replication When Table Structure Is Different – COLMAP

In this article, we will be using the Golden Gate COLMAP parameter for mapping table columns where the table structure is different. Get ready to set up Golden Gate replication between tables with different structures using defgen and COLMAP!

When Table Structure is Different
Golden Gate COLMAP Replication Scenarios
Create Definitions File
When Table Structure is Semi-Different
Create Source Definitions File

When Table Structure is Different

Golden Gate COLMAP Replication Scenarios

We are going to see two things in this activity: mapping different column names and mapping a different column order. Below is the mapping order for our tables.

Create TAB3 table on proddb

On Proddb:
==========
Conn fox/fox
CREATE TABLE tab3
( one NUMBER PRIMARY KEY,
  two VARCHAR2(30),
  three NUMBER,
  four NUMBER
);

On target, create the TAB3_DIFFCOL table with completely different column names

On Devdb:
=========
Conn tom/tom
CREATE TABLE tab3_diffcol
( Col_1 NUMBER PRIMARY KEY,
  Col_2 NUMBER,
  Col_3 VARCHAR2(30),
  Col_4 NUMBER
);

Connect to the database via Golden Gate and add table level supplemental logging

On proddb:
==========
cd $GG_HOME
./ggsci
GGSCI> dblogin userid ogg, password ogg
Successfully logged into database.
GGSCI> add trandata FOX.TAB3
Logging of supplemental redo data enabled for table FOX.TAB3.
TRANDATA for scheduling columns has been added on table 'FOX.TAB3'.

Create Definitions File

As our source and target tables have different structures, we need to create a source table definitions file and copy it to the target server. We use the DEFGEN utility to create the definitions file; this utility comes with the GG binaries. First, we need to create a parameter file for the DEFGEN utility. You can create it via the GG prompt or manually via the vi editor

On proddb:
==========
GGSCI> edit params defgen1

DEFSFILE /u01/app/oracle/product/gg/dirdef/FoxTab3Def.def
USERID ogg PASSWORD ogg
TABLE FOX.TAB3;

Exit the GG prompt and initiate the defgen utility to generate the definitions file

On proddb:
==========
cd $GG_HOME
./defgen paramfile /u01/app/oracle/product/gg/dirprm/defgen1.prm

Copy the definitions file to the target server under the $GG_HOME/dirdef location

On proddb:
==========
cd $GG_HOME/dirdef
scp FoxTab3Def.def oracle@ggdev:$GG_HOME/dirdef/

Create extract on source

On proddb:
==========
GGSCI> add extract PFOXT3E, integrated tranlog, begin now
GGSCI> register extract PFOXT3E database
GGSCI> add exttrail /u01/app/oracle/product/gg/dirdat/t3, extract PFOXT3E
GGSCI (ggprod) 3> edit param PFOXT3E

EXTRACT PFOXT3E
USERID ogg, PASSWORD ogg
EXTTRAIL /u01/app/oracle/product/gg/dirdat/t3
TABLE FOX.TAB3;

Extract naming convention used: P(prod) FOX(schema) T3(table first & last letter) E(extract)

Create data pump process

GGSCI> Add extract PFOXT3D, EXTTRAILSOURCE /u01/app/oracle/product/gg/dirdat/t3
GGSCI> Add rmttrail /u01/app/oracle/product/gg/dirdat/r3, extract PFOXT3D
GGSCI> edit param PFOXT3D

EXTRACT PFOXT3D
USERID ogg, PASSWORD ogg
RMTHOST ggdev, MGRPORT 7809
RMTTRAIL /u01/app/oracle/product/gg/dirdat/r3
TABLE fox.tab3;

Create the GG Replicat on target with the mapping details of our columns

On Devdb:
==========
GGSCI> dblogin userid ogg, password ogg
GGSCI> add replicat DTOMTLR, integrated exttrail /u01/app/oracle/product/gg/dirdat/r3
GGSCI> edit param DTOMTLR

REPLICAT DTOMTLR
USERID ogg, PASSWORD ogg
SOURCEDEFS /u01/app/oracle/product/gg/dirdef/FoxTab3Def.def
MAP fox.tab3 TARGET tom.tab3_diffcol, COLMAP(col_1=one, col_3=two, col_2=three, col_4=four);

SOURCEDEFS = specifies the source definitions file location on the target server
COLMAP = specifies the non-default mapping of columns from source to target

Start the Extract, Pump and Replicat processes

On proddb:
==========
GGSCI> start PFOXT3E
GGSCI> start PFOXT3D

On devdb:
=========
GGSCI> start DTOMTLR

Let us test our replication

On proddb:
==========
INSERT INTO tab3 VALUES (1,'Alpha',10,100);
INSERT INTO tab3 VALUES (2,'Beta',20,200);
INSERT INTO tab3 VALUES (3,'Gamma',30,300);
COMMIT;

When Table Structure is Semi-Different

Semi-different means a few columns on the target are the same as on the source.

Create TAB4 table on proddb

On Proddb:
==========
Conn fox/fox
CREATE TABLE tab4
( one NUMBER PRIMARY KEY,
  two VARCHAR2(30),
  three NUMBER,
  four NUMBER
);

On target, create the TAB4_DIFFCOL table with semi-different column names

On Devdb:
=========
Conn tom/tom
CREATE TABLE tab4_diffcol
( One NUMBER PRIMARY KEY,
  two VARCHAR2(30),
  Col_3 NUMBER,
  Col_4 NUMBER
);

Below is the mapping order for our tables.

Connect to the database via Golden Gate and add table level supplemental logging

On proddb:
==========
cd $GG_HOME
./ggsci
GGSCI> dblogin userid ogg, password ogg
Successfully logged into database.
GGSCI> add trandata FOX.TAB4
Logging of supplemental redo data enabled for table FOX.TAB4.
TRANDATA for scheduling columns has been added on table 'FOX.TAB4'.

Create Source Definitions File

As our source and target tables have different structures, we need to create a source table definitions file and copy it to the target server. We use the DEFGEN utility to create the definitions file; this utility comes with the GG binaries. First, we need to create a parameter file for the DEFGEN utility. You can create it via the GG prompt or manually via the vi editor

On proddb:
==========
GGSCI> edit params defgen2

DEFSFILE /u01/app/oracle/product/gg/dirdef/FoxTab4.def
USERID ogg PASSWORD ogg
TABLE FOX.TAB4;

Exit the GG prompt and initiate the defgen utility to generate the definitions file

On proddb:
==========
cd $GG_HOME
./defgen paramfile /u01/app/oracle/product/gg/dirprm/defgen2.prm

Copy the definitions file to the target server under the $GG_HOME/dirdef location

On proddb:
==========
cd $GG_HOME/dirdef
scp FoxTab4.def oracle@ggdev:$GG_HOME/dirdef/

Create extract on source

On proddb:
==========
GGSCI> add extract PFOXT4E, integrated tranlog, begin now
GGSCI> register extract PFOXT4E database
GGSCI> add exttrail /u01/app/oracle/product/gg/dirdat/t4, extract PFOXT4E
GGSCI (ggprod) 3> edit param PFOXT4E

EXTRACT PFOXT4E
USERID ogg, PASSWORD ogg
EXTTRAIL /u01/app/oracle/product/gg/dirdat/t4
TABLE FOX.TAB4;

Extract naming convention used: P(prod) FOX(schema) T4(table first & last letter) E(extract)

Create data pump process

GGSCI> Add extract PFOXT4D, EXTTRAILSOURCE /u01/app/oracle/product/gg/dirdat/t4
GGSCI> Add rmttrail /u01/app/oracle/product/gg/dirdat/r4, extract PFOXT4D
GGSCI> edit param PFOXT4D

EXTRACT PFOXT4D
USERID ogg, PASSWORD ogg
RMTHOST ggdev, MGRPORT 7809
RMTTRAIL /u01/app/oracle/product/gg/dirdat/r4
TABLE fox.tab4;

Create the GG Replicat on target with the mapping details of our columns. Notice that we now have a situation where a few columns on the target have the same names as on the source and a few have different names. In this case, we will use a new parameter, USEDEFAULTS, in the replicat file

On Devdb:
==========
GGSCI> dblogin userid ogg, password ogg
GGSCI> add replicat DTOMT4LR, integrated exttrail /u01/app/oracle/product/gg/dirdat/r4
GGSCI> edit param DTOMT4LR

REPLICAT DTOMT4LR
USERID ogg, PASSWORD ogg
SOURCEDEFS /u01/app/oracle/product/gg/dirdef/FoxTab4.def
MAP fox.tab4 TARGET tom.tab4_diffcol, COLMAP (USEDEFAULTS, col_3=three, col_4=four);

The USEDEFAULTS keyword specifies that column names are identical between the two tables except where a column mapping has been explicitly defined. In this case, therefore, we only need to specify the mappings col_3=three and col_4=four.

Start the Extract, Pump and Replicat processes

On proddb:
==========
GGSCI> start PFOXT4E
GGSCI> start PFOXT4D

On devdb:
=========
GGSCI> start DTOMT4LR

Let us test our replication by adding some rows to the TAB4 table in the FOX schema on the source database

On proddb:
==========
INSERT INTO tab4 VALUES (1,'Alpha',10,100);
INSERT INTO tab4 VALUES (2,'Beta',20,200);
INSERT INTO tab4 VALUES (3,'Gamma',30,300);
COMMIT;
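To verify that the COLMAP mapping actually landed the data in the right columns, a quick check on the target (a sketch reusing the tom.tab4_diffcol table above) is:

On devdb:
=========
GGSCI> stats replicat DTOMT4LR, latest

conn tom/tom
-- ONE and TWO come across unchanged (USEDEFAULTS); THREE and FOUR should appear in COL_3 and COL_4
SELECT one, two, col_3, col_4 FROM tab4_diffcol ORDER BY one;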

  • Find Current Session SID in Oracle

Find Current Session SID in Oracle

Sometimes when you are connected to an Oracle database, you might need to find your own session SID and serial number. Below are two queries that can help you find the SID and serial number of the current session you are connected with.

Find SID in normal database

In a normal standalone database, use the below query

select sid from v$mystat where rownum=1;

Find SID in RAC database

In a RAC database, you must also know the details of the instance that you are connected to. The below query will give you the SID and serial# along with the instance number

SELECT SID, SERIAL#, inst_id
FROM GV$SESSION
WHERE sid = (select sid from v$mystat where rownum=1);
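As an alternative (a small sketch; the USERENV SID attribute of SYS_CONTEXT is available in recent Oracle versions), you can avoid v$mystat entirely:

select sys_context('userenv','sid') sid,
       sys_context('userenv','instance') inst_id
from dual;

-- or join back to v$session to pick up the serial#
select sid, serial#
from v$session
where sid = sys_context('userenv','sid');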

  • Hotel Booking Application SQL Project

Hotel Booking Application SQL Project

The goal of this project is to test your SQL skills. You will need both technical and analytical skills to solve it. Before you start, you must have the CREATE TABLE and INSERT privileges. Your goal is to design the backend tables for a hotel booking application that address the specific requirements provided by the client.

Application Requirements

Below are the client's requirements for the hotel booking application:
Guests will be booking hotel rooms via a front end application
Each booking is recorded in the BOOKINGS table
There are various types of rooms offered by the hotel: Single, Double & Family
The amount charged depends on the room type and the number of people staying per night
Guests may be charged extras (for breakfast or using the minibar)

You are free to assume any other details that are required for the hotel booking application.

Task 1

First build an ER diagram for the hotel booking application and take approval from the client.

Task 2

Implement the ER diagram by building the tables in an Oracle database. You must also insert some dummy data to show that the tables and relations are working fine.
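As a rough starting point for Task 2 (a sketch only; the table and column names below are assumptions, not part of the client brief, and your own ER design from Task 1 takes precedence):

-- Hypothetical minimal model: one table for rooms, one for bookings
CREATE TABLE rooms (
  room_id        NUMBER PRIMARY KEY,
  room_type      VARCHAR2(10) NOT NULL
                 CHECK (room_type IN ('SINGLE','DOUBLE','FAMILY')),
  rate_per_night NUMBER(8,2) NOT NULL
);

CREATE TABLE bookings (
  booking_id    NUMBER PRIMARY KEY,
  room_id       NUMBER NOT NULL REFERENCES rooms(room_id),
  guest_name    VARCHAR2(50) NOT NULL,
  num_guests    NUMBER(2) NOT NULL,
  check_in      DATE NOT NULL,
  check_out     DATE NOT NULL,
  extras_amount NUMBER(8,2) DEFAULT 0,
  CONSTRAINT bk_dates_ck CHECK (check_out > check_in)
);

-- Dummy data to show the tables and the relation work
INSERT INTO rooms VALUES (101, 'DOUBLE', 120);
INSERT INTO bookings VALUES (1, 101, 'A. Guest', 2, DATE '2024-01-10', DATE '2024-01-12', 35);
COMMIT;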

  • DBA Scripts

DBA Scripts

Script to create JUSTLEE schema in Oracle - Here is the script to create the JUSTLEE schema in Oracle
Automate RMAN Backups via Catalog Scripts - You can take advantage of RMAN catalog scripts to automate RMAN backups on target databases. However, the biggest disadvantage is when...
Find Invalid Objects Inside Oracle - Changing things in the database can cause some objects to become INVALID. Query to check invalid objects in oracle SELECT OWNER, OBJECT_TYPE,...
Find SQL Id of the Statement You Just Ran - While connected to the database, you might want to know the sql id of the query you just ran (in your own session, not some other...
Find Current Session SID in Oracle - Sometimes when you are connected to an Oracle database, you might need to find your own session SID and serial number. Below are the two...
Grant Select on all tables in a schema - I encountered this situation where I wanted to grant SELECT on all the tables owned by one user to another user. There are two simple...
Find Session Id Running Specific Query - At times DBAs need to find or search for details of sessions that are running a specific query inside the database. For example, you might want to...
Script to create HR schema - I was working on a database connected via SQL Developer and could not find the HR schema. I wanted to create the HR schema using SQL queries....
Check table size in oracle - Once you run the query, it will ask for your table name. Enter the table name in the format owner.tablename, e.g. scott.emp select...
Find scheduler jobs in oracle - The below command will help you check the scheduler jobs that are configured inside the database SELECT JOB_NAME, STATE FROM DBA_SCHEDULER_JOBS...
Last modified table - As a DBA, the application team sometimes might ask you to provide details of the last modified table in oracle. The table modification can be...
Check FRA location utilization - Fast Recovery Area must be monitored regularly. Sometimes FRA runs out of space and a DBA must be able to gather FRA space utilization....

  • Oracle Data Guard Protection Modes

Oracle Data Guard Protection Modes

A Data Guard configuration always runs in one of three data protection modes (also called redo transport rules). By default, the protection mode is MAX PERFORMANCE. MAX PERFORMANCE uses ASYNC redo transport, while the other protection modes use SYNC redo transport. Looking at MAX PROTECTION and MAX AVAILABILITY, we can say that MAX PROTECTION is rarely used in real environments; the main reason is that if the standby becomes unavailable, the primary will shut down. The protection modes you will mostly use are MAX PERFORMANCE and MAX AVAILABILITY!

Switch from Max Performance to Max Availability Protection Mode

Verify the broker configuration, check that it is enabled and make sure log apply is enabled

dgmgrl sys/oracle@proddb
show configuration
show database proddb
show database proddb_st
edit database proddb_st set state=apply-on;

Change the LNS mode from ASYNC to SYNC and raise the protection mode

EDIT DATABASE proddb_st SET PROPERTY LogXptMode='SYNC';
EDIT CONFIGURATION SET PROTECTION MODE AS MaxAvailability;

Switch from Max Availability to Max Performance Protection Mode

Verify the broker configuration, check that it is enabled and make sure log apply is enabled

dgmgrl sys/oracle@proddb
show configuration
show database proddb
show database proddb_st
edit database proddb_st set state=apply-on;

Change the LNS mode from SYNC to ASYNC and lower the protection mode

EDIT DATABASE proddb_st SET PROPERTY LogXptMode='ASYNC';
EDIT CONFIGURATION SET PROTECTION MODE AS MaxPerformance;
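To confirm the mode change actually took effect, in addition to show configuration in DGMGRL you can query the database directly:

SQL> select protection_mode, protection_level from v$database;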

  • Downloading Oracle 12c Using Linux wget

Downloading Oracle 12c Using Linux wget

When you are working on a Linux server, you do not need to use tools like WinSCP or other file sharing software. You can simply use the wget command in Linux to download the Oracle software.

Open Internet Explorer
Linux wget command

Open Internet Explorer

Yes, you heard it right! This method does not work for Google Chrome or other browsers, though it does work for Mozilla Firefox. Go to oracle.com and log in to the website. Navigate to the database download page. Accept the terms of agreement and start the download. As your download starts, open the download window (inside Internet Explorer), pause the download, right click and copy the link address.

Linux wget command

Install the wget command using the root user

yum -y install wget

To begin the download, issue the wget command in the below format

wget http://download.oracle.com/otn/linux/oem/12105/em12105_linux64_disk1.zip?AuthParam=1464336954_bb257a448de89g4997f8a160b32efddd

The download should begin now! Once the download is done, if you issue the ls -lrt command you will see the zip file with the AuthParam string appended to the end of the file name. You can use the mv command to rename the file as below

# mv em12105_linux64_disk1.zip?AuthParam=1464336954_bb257a448de89g4997f8a160b32efddd em12105_linux64_disk1.zip
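Alternatively (a small sketch using the same copied link), you can pass -O to wget so the file is saved under a clean name straight away; quoting the URL also protects any special characters in the AuthParam token:

wget -O em12105_linux64_disk1.zip 'http://download.oracle.com/otn/linux/oem/12105/em12105_linux64_disk1.zip?AuthParam=1464336954_bb257a448de89g4997f8a160b32efddd'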

  • Oracle Golden Gate DDL Replication

Oracle Golden Gate DDL Replication

Data Definition Language (DDL) replication is the process of replicating database schema changes, such as creating, dropping, and modifying tables, views, and other objects, from one database to another. In this article, we will discuss the basics of Oracle Golden Gate DDL replication.

Quick start with DDL Replication
Configure Golden Gate DDL Replication
Test DDL Replication

Quick start with DDL Replication

You can enable DDL replication by simply adding the below to the extract parameter file

DDL INCLUDE MAPPED

Note: do not use DDL INCLUDE ALL unless it is really required. Now if you add a new column on a source table, it will reflect on the target too

ALTER TABLE table_name ADD col_name col_definition;

Configure Golden Gate DDL Replication

The configuration follows these steps (a sketch of the extract parameter file is given at the end of this post):
Create a test table on the source for the purpose of testing DDL replication
Create a table on the target with the same structure as the source
On the source, add supplemental logging for the table
Add the extract
Register the extract and edit its parameters to add DDL INCLUDE MAPPED
Add the data pump process
Start the extract and pump processes
Add the replicat (rep12) on the target and set its parameters
Start the replicat process on the target

Test DDL Replication

Add a column on the source table, then check the extract stats, the pump stats, and the replicat stats. Finally, check the target table to see whether the new column has been added.
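The original parameter listings are not reproduced in this excerpt. As a rough sketch only (the process name extddl, the trail path, and the table name are assumptions following the conventions of the earlier Golden Gate articles), an extract parameter file with DDL replication enabled could look like this:

EXTRACT extddl
USERID ogg, PASSWORD ogg
DDL INCLUDE MAPPED
EXTTRAIL /u01/app/oracle/product/gg/dirdat/d1
TABLE fox.ddl_test;

On the replicat side (rep12 in the article), no extra DDL parameter is usually needed; by default the replicat applies the DDL operations it finds in the trail.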

  • Miscellaneous

Miscellaneous

DBCA Does Not Display ASM Disk Groups In 12cR2 - I was installing Oracle 12c R2 with ASM but somehow the DBCA did not list ASM diskgroups. After struggling for hours, I could find a...
Oracle Database Cold Backup & Recovery - Oracle cold database backup is rarely used these days. DBAs hardly take cold database backups but sometimes it is important to take one...
Physical Oracle Database Limits - When you try to add data files or resize existing data files, you cannot go with any number in your mind. Oracle database has limitations...
Difference between 12cR1 and 12cR2 Multitenant database - There are a lot of new features introduced by Oracle in the 12c Release 2 version when compared to 12c Release 1. Here are some of the...
Oracle 11g to 12c Rolling Upgrade - A rolling upgrade allows you to perform a database upgrade without any noticeable downtime for the end users. There are multiple ways...
Control File and Redolog File Multiplexing - Control files and redolog files contain crucial database information and loss of these files will lead to loss of important data about...
Deinstall Oracle Software - This article describes how to remove Oracle software from a Linux server. There are different methods with which you can remove the...
Manual Database Creation - It's always a good idea to create an Oracle database using DBCA. This method of creating an Oracle database is outdated but you must also know...
Oracle Database Health Check - Daily DB Health Checks Below are daily checks which are to be performed by a DBA: Check all instance, listener are up and running SELECT...
Oracle Table and Tables Cluster - In this article we will be creating a table cluster inside Oracle. We will also be creating a couple of tables inside the cluster table....
Drop all schema objects - The below script will drop all the objects owned by a schema. This will not delete the user but only deletes the objects SET SERVEROUTPUT...

  • Script to create HR schema

Script to create HR schema

I was working on a database connected via SQL Developer and could not find the HR schema. I wanted to create the HR schema using SQL queries. I used the below scripts one by one and created the schema, starting with the SYS user.

Create HR user
Create schema objects
Insert rows into tables
Create index
Create procedural objects
Add comments to tables & columns
Gather schema stats

1. Create HR user

Execute the below script as the sys user

SET ECHO OFF
SET VERIFY OFF

PROMPT
PROMPT specify password for HR as parameter 1:
DEFINE pass = &1
PROMPT
PROMPT specify default tablespace for HR as parameter 2:
DEFINE tbs = &2
PROMPT
PROMPT specify temporary tablespace for HR as parameter 3:
DEFINE ttbs = &3
PROMPT
PROMPT specify password for SYS as parameter 4:
DEFINE pass_sys = &4
PROMPT
PROMPT specify log path as parameter 5:
DEFINE log_path = &5
PROMPT
PROMPT specify connect string as parameter 6:
DEFINE connect_string = &6
PROMPT

-- The first dot in the spool command below is
-- the SQL*Plus concatenation character
DEFINE spool_file = &log_path.hr_main.log
SPOOL &spool_file

REM =======================================================
REM cleanup section
REM =======================================================

DROP USER hr CASCADE;

REM =======================================================
REM create user
REM three separate commands, so the create user command
REM will succeed regardless of the existence of the
REM DEMO and TEMP tablespaces
REM =======================================================

CREATE USER hr IDENTIFIED BY &pass;

ALTER USER hr DEFAULT TABLESPACE &tbs QUOTA UNLIMITED ON &tbs;

ALTER USER hr TEMPORARY TABLESPACE &ttbs;

GRANT CREATE SESSION, CREATE VIEW, ALTER SESSION, CREATE SEQUENCE TO hr;
GRANT CREATE SYNONYM, CREATE DATABASE LINK, RESOURCE, UNLIMITED TABLESPACE TO hr;
GRANT execute ON sys.dbms_stats TO hr;

2. Create schema objects

Connect as the hr user and execute the below script to create the HR schema tables

conn hr

SET FEEDBACK 1
SET NUMWIDTH 10
SET LINESIZE 80
SET TRIMSPOOL ON
SET TAB OFF
SET PAGESIZE 100
SET ECHO OFF

REM ********************************************************************
REM Create the REGIONS table to hold region information for locations
REM HR.LOCATIONS table has a foreign key to this table.

Prompt ****** Creating REGIONS table ....

CREATE TABLE regions
( region_id NUMBER CONSTRAINT region_id_nn NOT NULL
, region_name VARCHAR2(25)
);

CREATE UNIQUE INDEX reg_id_pk
ON regions (region_id);

ALTER TABLE regions
ADD ( CONSTRAINT reg_id_pk PRIMARY KEY (region_id) );

REM ********************************************************************
REM Create the COUNTRIES table to hold country information for customers
REM and company locations.
REM OE.CUSTOMERS table and HR.LOCATIONS have a foreign key to this table.

Prompt ****** Creating COUNTRIES table ....

CREATE TABLE countries
( country_id CHAR(2) CONSTRAINT country_id_nn NOT NULL
, country_name VARCHAR2(40)
, region_id NUMBER
, CONSTRAINT country_c_id_pk PRIMARY KEY (country_id)
)
ORGANIZATION INDEX;

ALTER TABLE countries
ADD ( CONSTRAINT countr_reg_fk FOREIGN KEY (region_id) REFERENCES regions(region_id) );

REM ********************************************************************
REM Create the LOCATIONS table to hold address information for company departments.
REM HR.DEPARTMENTS has a foreign key to this table.

Prompt ****** Creating LOCATIONS table ....
CREATE TABLE locations ( location_id NUMBER(4) , street_address VARCHAR2(40) , postal_code VARCHAR2(12) , city VARCHAR2(30) CONSTRAINT loc_city_nn NOT NULL , state_province VARCHAR2(25) , country_id CHAR(2) ) ; CREATE UNIQUE INDEX loc_id_pk ON locations (location_id) ; ALTER TABLE locations ADD ( CONSTRAINT loc_id_pk PRIMARY KEY (location_id) , CONSTRAINT loc_c_id_fk FOREIGN KEY (country_id) REFERENCES countries(country_id) ) ; Rem Useful for any subsequent addition of rows to locations table Rem Starts with 3300 CREATE SEQUENCE locations_seq START WITH 3300 INCREMENT BY 100 MAXVALUE 9900 NOCACHE NOCYCLE; REM ******************************************************************** REM Create the DEPARTMENTS table to hold company department information. REM HR.EMPLOYEES and HR.JOB_HISTORY have a foreign key to this table. Prompt ****** Creating DEPARTMENTS table .... CREATE TABLE departments ( department_id NUMBER(4) , department_name VARCHAR2(30) CONSTRAINT dept_name_nn NOT NULL , manager_id NUMBER(6) , location_id NUMBER(4) ) ; CREATE UNIQUE INDEX dept_id_pk ON departments (department_id) ; ALTER TABLE departments ADD ( CONSTRAINT dept_id_pk PRIMARY KEY (department_id) , CONSTRAINT dept_loc_fk FOREIGN KEY (location_id) REFERENCES locations (location_id) ) ; Rem Useful for any subsequent addition of rows to departments table Rem Starts with 280 CREATE SEQUENCE departments_seq START WITH 280 INCREMENT BY 10 MAXVALUE 9990 NOCACHE NOCYCLE; REM ******************************************************************** REM Create the JOBS table to hold the different names of job roles within the company. REM HR.EMPLOYEES has a foreign key to this table. Prompt ****** Creating JOBS table .... CREATE TABLE jobs ( job_id VARCHAR2(10) , job_title VARCHAR2(35) CONSTRAINT job_title_nn NOT NULL , min_salary NUMBER(6) , max_salary NUMBER(6) ) ; CREATE UNIQUE INDEX job_id_pk ON jobs (job_id) ; ALTER TABLE jobs ADD ( CONSTRAINT job_id_pk PRIMARY KEY(job_id) ) ; REM ******************************************************************** REM Create the EMPLOYEES table to hold the employee personnel REM information for the company. REM HR.EMPLOYEES has a self referencing foreign key to this table. Prompt ****** Creating EMPLOYEES table .... 
CREATE TABLE employees ( employee_id NUMBER(6) , first_name VARCHAR2(20) , last_name VARCHAR2(25) CONSTRAINT emp_last_name_nn NOT NULL , email VARCHAR2(25) CONSTRAINT emp_email_nn NOT NULL , phone_number VARCHAR2(20) , hire_date DATE CONSTRAINT emp_hire_date_nn NOT NULL , job_id VARCHAR2(10) CONSTRAINT emp_job_nn NOT NULL , salary NUMBER(8,2) , commission_pct NUMBER(2,2) , manager_id NUMBER(6) , department_id NUMBER(4) , CONSTRAINT emp_salary_min CHECK (salary > 0) , CONSTRAINT emp_email_uk UNIQUE (email) ) ; CREATE UNIQUE INDEX emp_emp_id_pk ON employees (employee_id) ; ALTER TABLE employees ADD ( CONSTRAINT emp_emp_id_pk PRIMARY KEY (employee_id) , CONSTRAINT emp_dept_fk FOREIGN KEY (department_id) REFERENCES departments , CONSTRAINT emp_job_fk FOREIGN KEY (job_id) REFERENCES jobs (job_id) , CONSTRAINT emp_manager_fk FOREIGN KEY (manager_id) REFERENCES employees ) ; ALTER TABLE departments ADD ( CONSTRAINT dept_mgr_fk FOREIGN KEY (manager_id) REFERENCES employees (employee_id) ) ; Rem Useful for any subsequent addition of rows to employees table Rem Starts with 207 CREATE SEQUENCE employees_seq START WITH 207 INCREMENT BY 1 NOCACHE NOCYCLE; REM ******************************************************************** REM Create the JOB_HISTORY table to hold the history of jobs that REM employees have held in the past. REM HR.JOBS, HR_DEPARTMENTS, and HR.EMPLOYEES have a foreign key to this table. Prompt ****** Creating JOB_HISTORY table .... CREATE TABLE job_history ( employee_id NUMBER(6) CONSTRAINT jhist_employee_nn NOT NULL , start_date DATE CONSTRAINT jhist_start_date_nn NOT NULL , end_date DATE CONSTRAINT jhist_end_date_nn NOT NULL , job_id VARCHAR2(10) CONSTRAINT jhist_job_nn NOT NULL , department_id NUMBER(4) , CONSTRAINT jhist_date_interval CHECK (end_date > start_date) ) ; CREATE UNIQUE INDEX jhist_emp_id_st_date_pk ON job_history (employee_id, start_date) ; ALTER TABLE job_history ADD ( CONSTRAINT jhist_emp_id_st_date_pk PRIMARY KEY (employee_id, start_date) , CONSTRAINT jhist_job_fk FOREIGN KEY (job_id) REFERENCES jobs , CONSTRAINT jhist_emp_fk FOREIGN KEY (employee_id) REFERENCES employees , CONSTRAINT jhist_dept_fk FOREIGN KEY (department_id) REFERENCES departments ) ; REM ******************************************************************** REM Create the EMP_DETAILS_VIEW that joins the employees, jobs, REM departments, jobs, countries, and locations table to provide details REM about employees. Prompt ****** Creating EMP_DETAILS_VIEW view ... CREATE OR REPLACE VIEW emp_details_view (employee_id, job_id, manager_id, department_id, location_id, country_id, first_name, last_name, salary, commission_pct, department_name, job_title, city, state_province, country_name, region_name) AS SELECT e.employee_id, e.job_id, e.manager_id, e.department_id, d.location_id, l.country_id, e.first_name, e.last_name, e.salary, e.commission_pct, d.department_name, j.job_title, l.city, l.state_province, c.country_name, r.region_name FROM employees e, departments d, jobs j, locations l, countries c, regions r WHERE e.department_id = d.department_id AND d.location_id = l.location_id AND l.country_id = c.country_id AND c.region_id = r.region_id AND j.job_id = e.job_id WITH READ ONLY; COMMIT; 3. Insert rows into tables Execute below script to insert rows into tables created in previous step SET VERIFY OFF ALTER SESSION SET NLS_LANGUAGE=American; REM ***************************insert data into the REGIONS table Prompt ****** Populating REGIONS table .... 
INSERT INTO regions VALUES ( 1 , 'Europe' ); INSERT INTO regions VALUES ( 2 , 'Americas' ); INSERT INTO regions VALUES ( 3 , 'Asia' ); INSERT INTO regions VALUES ( 4 , 'Middle East and Africa' ); REM ***************************insert data into the COUNTRIES table Prompt ****** Populating COUNTIRES table .... INSERT INTO countries VALUES ( 'IT' , 'Italy' , 1 ); INSERT INTO countries VALUES ( 'JP' , 'Japan' , 3 ); INSERT INTO countries VALUES ( 'US' , 'United States of America' , 2 ); INSERT INTO countries VALUES ( 'CA' , 'Canada' , 2 ); INSERT INTO countries VALUES ( 'CN' , 'China' , 3 ); INSERT INTO countries VALUES ( 'IN' , 'India' , 3 ); INSERT INTO countries VALUES ( 'AU' , 'Australia' , 3 ); INSERT INTO countries VALUES ( 'ZW' , 'Zimbabwe' , 4 ); INSERT INTO countries VALUES ( 'SG' , 'Singapore' , 3 ); INSERT INTO countries VALUES ( 'UK' , 'United Kingdom' , 1 ); INSERT INTO countries VALUES ( 'FR' , 'France' , 1 ); INSERT INTO countries VALUES ( 'DE' , 'Germany' , 1 ); INSERT INTO countries VALUES ( 'ZM' , 'Zambia' , 4 ); INSERT INTO countries VALUES ( 'EG' , 'Egypt' , 4 ); INSERT INTO countries VALUES ( 'BR' , 'Brazil' , 2 ); INSERT INTO countries VALUES ( 'CH' , 'Switzerland' , 1 ); INSERT INTO countries VALUES ( 'NL' , 'Netherlands' , 1 ); INSERT INTO countries VALUES ( 'MX' , 'Mexico' , 2 ); INSERT INTO countries VALUES ( 'KW' , 'Kuwait' , 4 ); INSERT INTO countries VALUES ( 'IL' , 'Israel' , 4 ); INSERT INTO countries VALUES ( 'DK' , 'Denmark' , 1 ); INSERT INTO countries VALUES ( 'ML' , 'Malaysia' , 3 ); INSERT INTO countries VALUES ( 'NG' , 'Nigeria' , 4 ); INSERT INTO countries VALUES ( 'AR' , 'Argentina' , 2 ); INSERT INTO countries VALUES ( 'BE' , 'Belgium' , 1 ); REM ***************************insert data into the LOCATIONS table Prompt ****** Populating LOCATIONS table .... 
INSERT INTO locations VALUES ( 1000 , '1297 Via Cola di Rie' , '00989' , 'Roma' , NULL , 'IT' ); INSERT INTO locations VALUES ( 1100 , '93091 Calle della Testa' , '10934' , 'Venice' , NULL , 'IT' ); INSERT INTO locations VALUES ( 1200 , '2017 Shinjuku-ku' , '1689' , 'Tokyo' , 'Tokyo Prefecture' , 'JP' ); INSERT INTO locations VALUES ( 1300 , '9450 Kamiya-cho' , '6823' , 'Hiroshima' , NULL , 'JP' ); INSERT INTO locations VALUES ( 1400 , '2014 Jabberwocky Rd' , '26192' , 'Southlake' , 'Texas' , 'US' ); INSERT INTO locations VALUES ( 1500 , '2011 Interiors Blvd' , '99236' , 'South San Francisco' , 'California' , 'US' ); INSERT INTO locations VALUES ( 1600 , '2007 Zagora St' , '50090' , 'South Brunswick' , 'New Jersey' , 'US' ); INSERT INTO locations VALUES ( 1700 , '2004 Charade Rd' , '98199' , 'Seattle' , 'Washington' , 'US' ); INSERT INTO locations VALUES ( 1800 , '147 Spadina Ave' , 'M5V 2L7' , 'Toronto' , 'Ontario' , 'CA' ); INSERT INTO locations VALUES ( 1900 , '6092 Boxwood St' , 'YSW 9T2' , 'Whitehorse' , 'Yukon' , 'CA' ); INSERT INTO locations VALUES ( 2000 , '40-5-12 Laogianggen' , '190518' , 'Beijing' , NULL , 'CN' ); INSERT INTO locations VALUES ( 2100 , '1298 Vileparle (E)' , '490231' , 'Bombay' , 'Maharashtra' , 'IN' ); INSERT INTO locations VALUES ( 2200 , '12-98 Victoria Street' , '2901' , 'Sydney' , 'New South Wales' , 'AU' ); INSERT INTO locations VALUES ( 2300 , '198 Clementi North' , '540198' , 'Singapore' , NULL , 'SG' ); INSERT INTO locations VALUES ( 2400 , '8204 Arthur St' , NULL , 'London' , NULL , 'UK' ); INSERT INTO locations VALUES ( 2500 , 'Magdalen Centre, The Oxford Science Park' , 'OX9 9ZB' , 'Oxford' , 'Oxford' , 'UK' ); INSERT INTO locations VALUES ( 2600 , '9702 Chester Road' , '09629850293' , 'Stretford' , 'Manchester' , 'UK' ); INSERT INTO locations VALUES ( 2700 , 'Schwanthalerstr. 7031' , '80925' , 'Munich' , 'Bavaria' , 'DE' ); INSERT INTO locations VALUES ( 2800 , 'Rua Frei Caneca 1360 ' , '01307-002' , 'Sao Paulo' , 'Sao Paulo' , 'BR' ); INSERT INTO locations VALUES ( 2900 , '20 Rue des Corps-Saints' , '1730' , 'Geneva' , 'Geneve' , 'CH' ); INSERT INTO locations VALUES ( 3000 , 'Murtenstrasse 921' , '3095' , 'Bern' , 'BE' , 'CH' ); INSERT INTO locations VALUES ( 3100 , 'Pieter Breughelstraat 837' , '3029SK' , 'Utrecht' , 'Utrecht' , 'NL' ); INSERT INTO locations VALUES ( 3200 , 'Mariano Escobedo 9991' , '11932' , 'Mexico City' , 'Distrito Federal,' , 'MX' ); REM ****************************insert data into the DEPARTMENTS table Prompt ****** Populating DEPARTMENTS table .... 
REM disable integrity constraint to EMPLOYEES to load data ALTER TABLE departments DISABLE CONSTRAINT dept_mgr_fk; INSERT INTO departments VALUES ( 10 , 'Administration' , 200 , 1700 ); INSERT INTO departments VALUES ( 20 , 'Marketing' , 201 , 1800 ); INSERT INTO departments VALUES ( 30 , 'Purchasing' , 114 , 1700 ); INSERT INTO departments VALUES ( 40 , 'Human Resources' , 203 , 2400 ); INSERT INTO departments VALUES ( 50 , 'Shipping' , 121 , 1500 ); INSERT INTO departments VALUES ( 60 , 'IT' , 103 , 1400 ); INSERT INTO departments VALUES ( 70 , 'Public Relations' , 204 , 2700 ); INSERT INTO departments VALUES ( 80 , 'Sales' , 145 , 2500 ); INSERT INTO departments VALUES ( 90 , 'Executive' , 100 , 1700 ); INSERT INTO departments VALUES ( 100 , 'Finance' , 108 , 1700 ); INSERT INTO departments VALUES ( 110 , 'Accounting' , 205 , 1700 ); INSERT INTO departments VALUES ( 120 , 'Treasury' , NULL , 1700 ); INSERT INTO departments VALUES ( 130 , 'Corporate Tax' , NULL , 1700 ); INSERT INTO departments VALUES ( 140 , 'Control And Credit' , NULL , 1700 ); INSERT INTO departments VALUES ( 150 , 'Shareholder Services' , NULL , 1700 ); INSERT INTO departments VALUES ( 160 , 'Benefits' , NULL , 1700 ); INSERT INTO departments VALUES ( 170 , 'Manufacturing' , NULL , 1700 ); INSERT INTO departments VALUES ( 180 , 'Construction' , NULL , 1700 ); INSERT INTO departments VALUES ( 190 , 'Contracting' , NULL , 1700 ); INSERT INTO departments VALUES ( 200 , 'Operations' , NULL , 1700 ); INSERT INTO departments VALUES ( 210 , 'IT Support' , NULL , 1700 ); INSERT INTO departments VALUES ( 220 , 'NOC' , NULL , 1700 ); INSERT INTO departments VALUES ( 230 , 'IT Helpdesk' , NULL , 1700 ); INSERT INTO departments VALUES ( 240 , 'Government Sales' , NULL , 1700 ); INSERT INTO departments VALUES ( 250 , 'Retail Sales' , NULL , 1700 ); INSERT INTO departments VALUES ( 260 , 'Recruiting' , NULL , 1700 ); INSERT INTO departments VALUES ( 270 , 'Payroll' , NULL , 1700 ); REM ***************************insert data into the JOBS table Prompt ****** Populating JOBS table .... 
INSERT INTO jobs VALUES ( 'AD_PRES' , 'President' , 20080 , 40000 ); INSERT INTO jobs VALUES ( 'AD_VP' , 'Administration Vice President' , 15000 , 30000 ); INSERT INTO jobs VALUES ( 'AD_ASST' , 'Administration Assistant' , 3000 , 6000 ); INSERT INTO jobs VALUES ( 'FI_MGR' , 'Finance Manager' , 8200 , 16000 ); INSERT INTO jobs VALUES ( 'FI_ACCOUNT' , 'Accountant' , 4200 , 9000 ); INSERT INTO jobs VALUES ( 'AC_MGR' , 'Accounting Manager' , 8200 , 16000 ); INSERT INTO jobs VALUES ( 'AC_ACCOUNT' , 'Public Accountant' , 4200 , 9000 ); INSERT INTO jobs VALUES ( 'SA_MAN' , 'Sales Manager' , 10000 , 20080 ); INSERT INTO jobs VALUES ( 'SA_REP' , 'Sales Representative' , 6000 , 12008 ); INSERT INTO jobs VALUES ( 'PU_MAN' , 'Purchasing Manager' , 8000 , 15000 ); INSERT INTO jobs VALUES ( 'PU_CLERK' , 'Purchasing Clerk' , 2500 , 5500 ); INSERT INTO jobs VALUES ( 'ST_MAN' , 'Stock Manager' , 5500 , 8500 ); INSERT INTO jobs VALUES ( 'ST_CLERK' , 'Stock Clerk' , 2008 , 5000 ); INSERT INTO jobs VALUES ( 'SH_CLERK' , 'Shipping Clerk' , 2500 , 5500 ); INSERT INTO jobs VALUES ( 'IT_PROG' , 'Programmer' , 4000 , 10000 ); INSERT INTO jobs VALUES ( 'MK_MAN' , 'Marketing Manager' , 9000 , 15000 ); INSERT INTO jobs VALUES ( 'MK_REP' , 'Marketing Representative' , 4000 , 9000 ); INSERT INTO jobs VALUES ( 'HR_REP' , 'Human Resources Representative' , 4000 , 9000 ); INSERT INTO jobs VALUES ( 'PR_REP' , 'Public Relations Representative' , 4500 , 10500 ); REM ***************************insert data into the EMPLOYEES table Prompt ****** Populating EMPLOYEES table .... INSERT INTO employees VALUES ( 100 , 'Steven' , 'King' , 'SKING' , '515.123.4567' , TO_DATE('17-06-2003', 'dd-MM-yyyy') , 'AD_PRES' , 24000 , NULL , NULL , 90 ); INSERT INTO employees VALUES ( 101 , 'Neena' , 'Kochhar' , 'NKOCHHAR' , '515.123.4568' , TO_DATE('21-09-2005', 'dd-MM-yyyy') , 'AD_VP' , 17000 , NULL , 100 , 90 ); INSERT INTO employees VALUES ( 102 , 'Lex' , 'De Haan' , 'LDEHAAN' , '515.123.4569' , TO_DATE('13-01-2001', 'dd-MM-yyyy') , 'AD_VP' , 17000 , NULL , 100 , 90 ); INSERT INTO employees VALUES ( 103 , 'Alexander' , 'Hunold' , 'AHUNOLD' , '590.423.4567' , TO_DATE('03-01-2006', 'dd-MM-yyyy') , 'IT_PROG' , 9000 , NULL , 102 , 60 ); INSERT INTO employees VALUES ( 104 , 'Bruce' , 'Ernst' , 'BERNST' , '590.423.4568' , TO_DATE('21-05-2007', 'dd-MM-yyyy') , 'IT_PROG' , 6000 , NULL , 103 , 60 ); INSERT INTO employees VALUES ( 105 , 'David' , 'Austin' , 'DAUSTIN' , '590.423.4569' , TO_DATE('25-06-2005', 'dd-MM-yyyy') , 'IT_PROG' , 4800 , NULL , 103 , 60 ); INSERT INTO employees VALUES ( 106 , 'Valli' , 'Pataballa' , 'VPATABAL' , '590.423.4560' , TO_DATE('05-02-2006', 'dd-MM-yyyy') , 'IT_PROG' , 4800 , NULL , 103 , 60 ); INSERT INTO employees VALUES ( 107 , 'Diana' , 'Lorentz' , 'DLORENTZ' , '590.423.5567' , TO_DATE('07-02-2007', 'dd-MM-yyyy') , 'IT_PROG' , 4200 , NULL , 103 , 60 ); INSERT INTO employees VALUES ( 108 , 'Nancy' , 'Greenberg' , 'NGREENBE' , '515.124.4569' , TO_DATE('17-08-2002', 'dd-MM-yyyy') , 'FI_MGR' , 12008 , NULL , 101 , 100 ); INSERT INTO employees VALUES ( 109 , 'Daniel' , 'Faviet' , 'DFAVIET' , '515.124.4169' , TO_DATE('16-08-2002', 'dd-MM-yyyy') , 'FI_ACCOUNT' , 9000 , NULL , 108 , 100 ); INSERT INTO employees VALUES ( 110 , 'John' , 'Chen' , 'JCHEN' , '515.124.4269' , TO_DATE('28-09-2005', 'dd-MM-yyyy') , 'FI_ACCOUNT' , 8200 , NULL , 108 , 100 ); INSERT INTO employees VALUES ( 111 , 'Ismael' , 'Sciarra' , 'ISCIARRA' , '515.124.4369' , TO_DATE('30-09-2005', 'dd-MM-yyyy') , 'FI_ACCOUNT' , 7700 , NULL , 108 , 100 ); INSERT INTO 
employees VALUES ( 112 , 'Jose Manuel' , 'Urman' , 'JMURMAN' , '515.124.4469' , TO_DATE('07-03-2006', 'dd-MM-yyyy') , 'FI_ACCOUNT' , 7800 , NULL , 108 , 100 ); INSERT INTO employees VALUES ( 113 , 'Luis' , 'Popp' , 'LPOPP' , '515.124.4567' , TO_DATE('07-12-2007', 'dd-MM-yyyy') , 'FI_ACCOUNT' , 6900 , NULL , 108 , 100 ); INSERT INTO employees VALUES ( 114 , 'Den' , 'Raphaely' , 'DRAPHEAL' , '515.127.4561' , TO_DATE('07-12-2002', 'dd-MM-yyyy') , 'PU_MAN' , 11000 , NULL , 100 , 30 ); INSERT INTO employees VALUES ( 115 , 'Alexander' , 'Khoo' , 'AKHOO' , '515.127.4562' , TO_DATE('18-05-2003', 'dd-MM-yyyy') , 'PU_CLERK' , 3100 , NULL , 114 , 30 ); INSERT INTO employees VALUES ( 116 , 'Shelli' , 'Baida' , 'SBAIDA' , '515.127.4563' , TO_DATE('24-12-2005', 'dd-MM-yyyy') , 'PU_CLERK' , 2900 , NULL , 114 , 30 ); INSERT INTO employees VALUES ( 117 , 'Sigal' , 'Tobias' , 'STOBIAS' , '515.127.4564' , TO_DATE('24-07-2005', 'dd-MM-yyyy') , 'PU_CLERK' , 2800 , NULL , 114 , 30 ); INSERT INTO employees VALUES ( 118 , 'Guy' , 'Himuro' , 'GHIMURO' , '515.127.4565' , TO_DATE('15-11-2006', 'dd-MM-yyyy') , 'PU_CLERK' , 2600 , NULL , 114 , 30 ); INSERT INTO employees VALUES ( 119 , 'Karen' , 'Colmenares' , 'KCOLMENA' , '515.127.4566' , TO_DATE('10-08-2007', 'dd-MM-yyyy') , 'PU_CLERK' , 2500 , NULL , 114 , 30 ); INSERT INTO employees VALUES ( 120 , 'Matthew' , 'Weiss' , 'MWEISS' , '650.123.1234' , TO_DATE('18-07-2004', 'dd-MM-yyyy') , 'ST_MAN' , 8000 , NULL , 100 , 50 ); INSERT INTO employees VALUES ( 121 , 'Adam' , 'Fripp' , 'AFRIPP' , '650.123.2234' , TO_DATE('10-04-2005', 'dd-MM-yyyy') , 'ST_MAN' , 8200 , NULL , 100 , 50 ); INSERT INTO employees VALUES ( 122 , 'Payam' , 'Kaufling' , 'PKAUFLIN' , '650.123.3234' , TO_DATE('01-05-2003', 'dd-MM-yyyy') , 'ST_MAN' , 7900 , NULL , 100 , 50 ); INSERT INTO employees VALUES ( 123 , 'Shanta' , 'Vollman' , 'SVOLLMAN' , '650.123.4234' , TO_DATE('10-10-2005', 'dd-MM-yyyy') , 'ST_MAN' , 6500 , NULL , 100 , 50 ); INSERT INTO employees VALUES ( 124 , 'Kevin' , 'Mourgos' , 'KMOURGOS' , '650.123.5234' , TO_DATE('16-11-2007', 'dd-MM-yyyy') , 'ST_MAN' , 5800 , NULL , 100 , 50 ); INSERT INTO employees VALUES ( 125 , 'Julia' , 'Nayer' , 'JNAYER' , '650.124.1214' , TO_DATE('16-07-2005', 'dd-MM-yyyy') , 'ST_CLERK' , 3200 , NULL , 120 , 50 ); INSERT INTO employees VALUES ( 126 , 'Irene' , 'Mikkilineni' , 'IMIKKILI' , '650.124.1224' , TO_DATE('28-09-2006', 'dd-MM-yyyy') , 'ST_CLERK' , 2700 , NULL , 120 , 50 ); INSERT INTO employees VALUES ( 127 , 'James' , 'Landry' , 'JLANDRY' , '650.124.1334' , TO_DATE('14-01-2007', 'dd-MM-yyyy') , 'ST_CLERK' , 2400 , NULL , 120 , 50 ); INSERT INTO employees VALUES ( 128 , 'Steven' , 'Markle' , 'SMARKLE' , '650.124.1434' , TO_DATE('08-03-2008', 'dd-MM-yyyy') , 'ST_CLERK' , 2200 , NULL , 120 , 50 ); INSERT INTO employees VALUES ( 129 , 'Laura' , 'Bissot' , 'LBISSOT' , '650.124.5234' , TO_DATE('20-08-2005', 'dd-MM-yyyy') , 'ST_CLERK' , 3300 , NULL , 121 , 50 ); INSERT INTO employees VALUES ( 130 , 'Mozhe' , 'Atkinson' , 'MATKINSO' , '650.124.6234' , TO_DATE('30-10-2005', 'dd-MM-yyyy') , 'ST_CLERK' , 2800 , NULL , 121 , 50 ); INSERT INTO employees VALUES ( 131 , 'James' , 'Marlow' , 'JAMRLOW' , '650.124.7234' , TO_DATE('16-02-2005', 'dd-MM-yyyy') , 'ST_CLERK' , 2500 , NULL , 121 , 50 ); INSERT INTO employees VALUES ( 132 , 'TJ' , 'Olson' , 'TJOLSON' , '650.124.8234' , TO_DATE('10-04-2007', 'dd-MM-yyyy') , 'ST_CLERK' , 2100 , NULL , 121 , 50 ); INSERT INTO employees VALUES ( 133 , 'Jason' , 'Mallin' , 'JMALLIN' , '650.127.1934' , TO_DATE('14-06-2004', 
'dd-MM-yyyy') , 'ST_CLERK' , 3300 , NULL , 122 , 50 ); INSERT INTO employees VALUES ( 134 , 'Michael' , 'Rogers' , 'MROGERS' , '650.127.1834' , TO_DATE('26-08-2006', 'dd-MM-yyyy') , 'ST_CLERK' , 2900 , NULL , 122 , 50 ); INSERT INTO employees VALUES ( 135 , 'Ki' , 'Gee' , 'KGEE' , '650.127.1734' , TO_DATE('12-12-2007', 'dd-MM-yyyy') , 'ST_CLERK' , 2400 , NULL , 122 , 50 ); INSERT INTO employees VALUES ( 136 , 'Hazel' , 'Philtanker' , 'HPHILTAN' , '650.127.1634' , TO_DATE('06-02-2008', 'dd-MM-yyyy') , 'ST_CLERK' , 2200 , NULL , 122 , 50 ); INSERT INTO employees VALUES ( 137 , 'Renske' , 'Ladwig' , 'RLADWIG' , '650.121.1234' , TO_DATE('14-07-2003', 'dd-MM-yyyy') , 'ST_CLERK' , 3600 , NULL , 123 , 50 ); INSERT INTO employees VALUES ( 138 , 'Stephen' , 'Stiles' , 'SSTILES' , '650.121.2034' , TO_DATE('26-10-2005', 'dd-MM-yyyy') , 'ST_CLERK' , 3200 , NULL , 123 , 50 ); INSERT INTO employees VALUES ( 139 , 'John' , 'Seo' , 'JSEO' , '650.121.2019' , TO_DATE('12-02-2006', 'dd-MM-yyyy') , 'ST_CLERK' , 2700 , NULL , 123 , 50 ); INSERT INTO employees VALUES ( 140 , 'Joshua' , 'Patel' , 'JPATEL' , '650.121.1834' , TO_DATE('06-04-2006', 'dd-MM-yyyy') , 'ST_CLERK' , 2500 , NULL , 123 , 50 ); INSERT INTO employees VALUES ( 141 , 'Trenna' , 'Rajs' , 'TRAJS' , '650.121.8009' , TO_DATE('17-10-2003', 'dd-MM-yyyy') , 'ST_CLERK' , 3500 , NULL , 124 , 50 ); INSERT INTO employees VALUES ( 142 , 'Curtis' , 'Davies' , 'CDAVIES' , '650.121.2994' , TO_DATE('29-01-2005', 'dd-MM-yyyy') , 'ST_CLERK' , 3100 , NULL , 124 , 50 ); INSERT INTO employees VALUES ( 143 , 'Randall' , 'Matos' , 'RMATOS' , '650.121.2874' , TO_DATE('15-03-2006', 'dd-MM-yyyy') , 'ST_CLERK' , 2600 , NULL , 124 , 50 ); INSERT INTO employees VALUES ( 144 , 'Peter' , 'Vargas' , 'PVARGAS' , '650.121.2004' , TO_DATE('09-07-2006', 'dd-MM-yyyy') , 'ST_CLERK' , 2500 , NULL , 124 , 50 ); INSERT INTO employees VALUES ( 145 , 'John' , 'Russell' , 'JRUSSEL' , '011.44.1344.429268' , TO_DATE('01-10-2004', 'dd-MM-yyyy') , 'SA_MAN' , 14000 , .4 , 100 , 80 ); INSERT INTO employees VALUES ( 146 , 'Karen' , 'Partners' , 'KPARTNER' , '011.44.1344.467268' , TO_DATE('05-01-2005', 'dd-MM-yyyy') , 'SA_MAN' , 13500 , .3 , 100 , 80 ); INSERT INTO employees VALUES ( 147 , 'Alberto' , 'Errazuriz' , 'AERRAZUR' , '011.44.1344.429278' , TO_DATE('10-03-2005', 'dd-MM-yyyy') , 'SA_MAN' , 12000 , .3 , 100 , 80 ); INSERT INTO employees VALUES ( 148 , 'Gerald' , 'Cambrault' , 'GCAMBRAU' , '011.44.1344.619268' , TO_DATE('15-10-2007', 'dd-MM-yyyy') , 'SA_MAN' , 11000 , .3 , 100 , 80 ); INSERT INTO employees VALUES ( 149 , 'Eleni' , 'Zlotkey' , 'EZLOTKEY' , '011.44.1344.429018' , TO_DATE('29-01-2008', 'dd-MM-yyyy') , 'SA_MAN' , 10500 , .2 , 100 , 80 ); INSERT INTO employees VALUES ( 150 , 'Peter' , 'Tucker' , 'PTUCKER' , '011.44.1344.129268' , TO_DATE('30-01-2005', 'dd-MM-yyyy') , 'SA_REP' , 10000 , .3 , 145 , 80 ); INSERT INTO employees VALUES ( 151 , 'David' , 'Bernstein' , 'DBERNSTE' , '011.44.1344.345268' , TO_DATE('24-03-2005', 'dd-MM-yyyy') , 'SA_REP' , 9500 , .25 , 145 , 80 ); INSERT INTO employees VALUES ( 152 , 'Peter' , 'Hall' , 'PHALL' , '011.44.1344.478968' , TO_DATE('20-08-2005', 'dd-MM-yyyy') , 'SA_REP' , 9000 , .25 , 145 , 80 ); INSERT INTO employees VALUES ( 153 , 'Christopher' , 'Olsen' , 'COLSEN' , '011.44.1344.498718' , TO_DATE('30-03-2006', 'dd-MM-yyyy') , 'SA_REP' , 8000 , .2 , 145 , 80 ); INSERT INTO employees VALUES ( 154 , 'Nanette' , 'Cambrault' , 'NCAMBRAU' , '011.44.1344.987668' , TO_DATE('09-12-2006', 'dd-MM-yyyy') , 'SA_REP' , 7500 , .2 , 145 , 80 ); INSERT 
INTO employees VALUES ( 155 , 'Oliver' , 'Tuvault' , 'OTUVAULT' , '011.44.1344.486508' , TO_DATE('23-11-2007', 'dd-MM-yyyy') , 'SA_REP' , 7000 , .15 , 145 , 80 ); INSERT INTO employees VALUES ( 156 , 'Janette' , 'King' , 'JKING' , '011.44.1345.429268' , TO_DATE('30-01-2004', 'dd-MM-yyyy') , 'SA_REP' , 10000 , .35 , 146 , 80 ); INSERT INTO employees VALUES ( 157 , 'Patrick' , 'Sully' , 'PSULLY' , '011.44.1345.929268' , TO_DATE('04-03-2004', 'dd-MM-yyyy') , 'SA_REP' , 9500 , .35 , 146 , 80 ); INSERT INTO employees VALUES ( 158 , 'Allan' , 'McEwen' , 'AMCEWEN' , '011.44.1345.829268' , TO_DATE('01-08-2004', 'dd-MM-yyyy') , 'SA_REP' , 9000 , .35 , 146 , 80 ); INSERT INTO employees VALUES ( 159 , 'Lindsey' , 'Smith' , 'LSMITH' , '011.44.1345.729268' , TO_DATE('10-03-2005', 'dd-MM-yyyy') , 'SA_REP' , 8000 , .3 , 146 , 80 ); INSERT INTO employees VALUES ( 160 , 'Louise' , 'Doran' , 'LDORAN' , '011.44.1345.629268' , TO_DATE('15-12-2005', 'dd-MM-yyyy') , 'SA_REP' , 7500 , .3 , 146 , 80 ); INSERT INTO employees VALUES ( 161 , 'Sarath' , 'Sewall' , 'SSEWALL' , '011.44.1345.529268' , TO_DATE('03-11-2006', 'dd-MM-yyyy') , 'SA_REP' , 7000 , .25 , 146 , 80 ); INSERT INTO employees VALUES ( 162 , 'Clara' , 'Vishney' , 'CVISHNEY' , '011.44.1346.129268' , TO_DATE('11-11-2005', 'dd-MM-yyyy') , 'SA_REP' , 10500 , .25 , 147 , 80 ); INSERT INTO employees VALUES ( 163 , 'Danielle' , 'Greene' , 'DGREENE' , '011.44.1346.229268' , TO_DATE('19-03-2007', 'dd-MM-yyyy') , 'SA_REP' , 9500 , .15 , 147 , 80 ); INSERT INTO employees VALUES ( 164 , 'Mattea' , 'Marvins' , 'MMARVINS' , '011.44.1346.329268' , TO_DATE('24-01-2008', 'dd-MM-yyyy') , 'SA_REP' , 7200 , .10 , 147 , 80 ); INSERT INTO employees VALUES ( 165 , 'David' , 'Lee' , 'DLEE' , '011.44.1346.529268' , TO_DATE('23-02-2008', 'dd-MM-yyyy') , 'SA_REP' , 6800 , .1 , 147 , 80 ); INSERT INTO employees VALUES ( 166 , 'Sundar' , 'Ande' , 'SANDE' , '011.44.1346.629268' , TO_DATE('24-03-2008', 'dd-MM-yyyy') , 'SA_REP' , 6400 , .10 , 147 , 80 ); INSERT INTO employees VALUES ( 167 , 'Amit' , 'Banda' , 'ABANDA' , '011.44.1346.729268' , TO_DATE('21-04-2008', 'dd-MM-yyyy') , 'SA_REP' , 6200 , .10 , 147 , 80 ); INSERT INTO employees VALUES ( 168 , 'Lisa' , 'Ozer' , 'LOZER' , '011.44.1343.929268' , TO_DATE('11-03-2005', 'dd-MM-yyyy') , 'SA_REP' , 11500 , .25 , 148 , 80 ); INSERT INTO employees VALUES ( 169 , 'Harrison' , 'Bloom' , 'HBLOOM' , '011.44.1343.829268' , TO_DATE('23-03-2006', 'dd-MM-yyyy') , 'SA_REP' , 10000 , .20 , 148 , 80 ); INSERT INTO employees VALUES ( 170 , 'Tayler' , 'Fox' , 'TFOX' , '011.44.1343.729268' , TO_DATE('24-01-2006', 'dd-MM-yyyy') , 'SA_REP' , 9600 , .20 , 148 , 80 ); INSERT INTO employees VALUES ( 171 , 'William' , 'Smith' , 'WSMITH' , '011.44.1343.629268' , TO_DATE('23-02-2007', 'dd-MM-yyyy') , 'SA_REP' , 7400 , .15 , 148 , 80 ); INSERT INTO employees VALUES ( 172 , 'Elizabeth' , 'Bates' , 'EBATES' , '011.44.1343.529268' , TO_DATE('24-03-2007', 'dd-MM-yyyy') , 'SA_REP' , 7300 , .15 , 148 , 80 ); INSERT INTO employees VALUES ( 173 , 'Sundita' , 'Kumar' , 'SKUMAR' , '011.44.1343.329268' , TO_DATE('21-04-2008', 'dd-MM-yyyy') , 'SA_REP' , 6100 , .10 , 148 , 80 ); INSERT INTO employees VALUES ( 174 , 'Ellen' , 'Abel' , 'EABEL' , '011.44.1644.429267' , TO_DATE('11-05-2004', 'dd-MM-yyyy') , 'SA_REP' , 11000 , .30 , 149 , 80 ); INSERT INTO employees VALUES ( 175 , 'Alyssa' , 'Hutton' , 'AHUTTON' , '011.44.1644.429266' , TO_DATE('19-03-2005', 'dd-MM-yyyy') , 'SA_REP' , 8800 , .25 , 149 , 80 ); INSERT INTO employees VALUES ( 176 , 'Jonathon' , 'Taylor' , 
'JTAYLOR' , '011.44.1644.429265' , TO_DATE('24-03-2006', 'dd-MM-yyyy') , 'SA_REP' , 8600 , .20 , 149 , 80 ); INSERT INTO employees VALUES ( 177 , 'Jack' , 'Livingston' , 'JLIVINGS' , '011.44.1644.429264' , TO_DATE('23-04-2006', 'dd-MM-yyyy') , 'SA_REP' , 8400 , .20 , 149 , 80 ); INSERT INTO employees VALUES ( 178 , 'Kimberely' , 'Grant' , 'KGRANT' , '011.44.1644.429263' , TO_DATE('24-05-2007', 'dd-MM-yyyy') , 'SA_REP' , 7000 , .15 , 149 , NULL ); INSERT INTO employees VALUES ( 179 , 'Charles' , 'Johnson' , 'CJOHNSON' , '011.44.1644.429262' , TO_DATE('04-01-2008', 'dd-MM-yyyy') , 'SA_REP' , 6200 , .10 , 149 , 80 ); INSERT INTO employees VALUES ( 180 , 'Winston' , 'Taylor' , 'WTAYLOR' , '650.507.9876' , TO_DATE('24-01-2006', 'dd-MM-yyyy') , 'SH_CLERK' , 3200 , NULL , 120 , 50 ); INSERT INTO employees VALUES ( 181 , 'Jean' , 'Fleaur' , 'JFLEAUR' , '650.507.9877' , TO_DATE('23-02-2006', 'dd-MM-yyyy') , 'SH_CLERK' , 3100 , NULL , 120 , 50 ); INSERT INTO employees VALUES ( 182 , 'Martha' , 'Sullivan' , 'MSULLIVA' , '650.507.9878' , TO_DATE('21-06-2007', 'dd-MM-yyyy') , 'SH_CLERK' , 2500 , NULL , 120 , 50 ); INSERT INTO employees VALUES ( 183 , 'Girard' , 'Geoni' , 'GGEONI' , '650.507.9879' , TO_DATE('03-02-2008', 'dd-MM-yyyy') , 'SH_CLERK' , 2800 , NULL , 120 , 50 ); INSERT INTO employees VALUES ( 184 , 'Nandita' , 'Sarchand' , 'NSARCHAN' , '650.509.1876' , TO_DATE('27-01-2004', 'dd-MM-yyyy') , 'SH_CLERK' , 4200 , NULL , 121 , 50 ); INSERT INTO employees VALUES ( 185 , 'Alexis' , 'Bull' , 'ABULL' , '650.509.2876' , TO_DATE('20-02-2005', 'dd-MM-yyyy') , 'SH_CLERK' , 4100 , NULL , 121 , 50 ); INSERT INTO employees VALUES ( 186 , 'Julia' , 'Dellinger' , 'JDELLING' , '650.509.3876' , TO_DATE('24-06-2006', 'dd-MM-yyyy') , 'SH_CLERK' , 3400 , NULL , 121 , 50 ); INSERT INTO employees VALUES ( 187 , 'Anthony' , 'Cabrio' , 'ACABRIO' , '650.509.4876' , TO_DATE('07-02-2007', 'dd-MM-yyyy') , 'SH_CLERK' , 3000 , NULL , 121 , 50 ); INSERT INTO employees VALUES ( 188 , 'Kelly' , 'Chung' , 'KCHUNG' , '650.505.1876' , TO_DATE('14-06-2005', 'dd-MM-yyyy') , 'SH_CLERK' , 3800 , NULL , 122 , 50 ); INSERT INTO employees VALUES ( 189 , 'Jennifer' , 'Dilly' , 'JDILLY' , '650.505.2876' , TO_DATE('13-08-2005', 'dd-MM-yyyy') , 'SH_CLERK' , 3600 , NULL , 122 , 50 ); INSERT INTO employees VALUES ( 190 , 'Timothy' , 'Gates' , 'TGATES' , '650.505.3876' , TO_DATE('11-07-2006', 'dd-MM-yyyy') , 'SH_CLERK' , 2900 , NULL , 122 , 50 ); INSERT INTO employees VALUES ( 191 , 'Randall' , 'Perkins' , 'RPERKINS' , '650.505.4876' , TO_DATE('19-12-2007', 'dd-MM-yyyy') , 'SH_CLERK' , 2500 , NULL , 122 , 50 ); INSERT INTO employees VALUES ( 192 , 'Sarah' , 'Bell' , 'SBELL' , '650.501.1876' , TO_DATE('04-02-2004', 'dd-MM-yyyy') , 'SH_CLERK' , 4000 , NULL , 123 , 50 ); INSERT INTO employees VALUES ( 193 , 'Britney' , 'Everett' , 'BEVERETT' , '650.501.2876' , TO_DATE('03-03-2005', 'dd-MM-yyyy') , 'SH_CLERK' , 3900 , NULL , 123 , 50 ); INSERT INTO employees VALUES ( 194 , 'Samuel' , 'McCain' , 'SMCCAIN' , '650.501.3876' , TO_DATE('01-07-2006', 'dd-MM-yyyy') , 'SH_CLERK' , 3200 , NULL , 123 , 50 ); INSERT INTO employees VALUES ( 195 , 'Vance' , 'Jones' , 'VJONES' , '650.501.4876' , TO_DATE('17-03-2007', 'dd-MM-yyyy') , 'SH_CLERK' , 2800 , NULL , 123 , 50 ); INSERT INTO employees VALUES ( 196 , 'Alana' , 'Walsh' , 'AWALSH' , '650.507.9811' , TO_DATE('24-04-2006', 'dd-MM-yyyy') , 'SH_CLERK' , 3100 , NULL , 124 , 50 ); INSERT INTO employees VALUES ( 197 , 'Kevin' , 'Feeney' , 'KFEENEY' , '650.507.9822' , TO_DATE('23-05-2006', 'dd-MM-yyyy') , 
'SH_CLERK' , 3000 , NULL , 124 , 50 ); INSERT INTO employees VALUES ( 198 , 'Donald' , 'OConnell' , 'DOCONNEL' , '650.507.9833' , TO_DATE('21-06-2007', 'dd-MM-yyyy') , 'SH_CLERK' , 2600 , NULL , 124 , 50 ); INSERT INTO employees VALUES ( 199 , 'Douglas' , 'Grant' , 'DGRANT' , '650.507.9844' , TO_DATE('13-01-2008', 'dd-MM-yyyy') , 'SH_CLERK' , 2600 , NULL , 124 , 50 ); INSERT INTO employees VALUES ( 200 , 'Jennifer' , 'Whalen' , 'JWHALEN' , '515.123.4444' , TO_DATE('17-09-2003', 'dd-MM-yyyy') , 'AD_ASST' , 4400 , NULL , 101 , 10 ); INSERT INTO employees VALUES ( 201 , 'Michael' , 'Hartstein' , 'MHARTSTE' , '515.123.5555' , TO_DATE('17-02-2004', 'dd-MM-yyyy') , 'MK_MAN' , 13000 , NULL , 100 , 20 ); INSERT INTO employees VALUES ( 202 , 'Pat' , 'Fay' , 'PFAY' , '603.123.6666' , TO_DATE('17-08-2005', 'dd-MM-yyyy') , 'MK_REP' , 6000 , NULL , 201 , 20 ); INSERT INTO employees VALUES ( 203 , 'Susan' , 'Mavris' , 'SMAVRIS' , '515.123.7777' , TO_DATE('07-06-2002', 'dd-MM-yyyy') , 'HR_REP' , 6500 , NULL , 101 , 40 ); INSERT INTO employees VALUES ( 204 , 'Hermann' , 'Baer' , 'HBAER' , '515.123.8888' , TO_DATE('07-06-2002', 'dd-MM-yyyy') , 'PR_REP' , 10000 , NULL , 101 , 70 ); INSERT INTO employees VALUES ( 205 , 'Shelley' , 'Higgins' , 'SHIGGINS' , '515.123.8080' , TO_DATE('07-06-2002', 'dd-MM-yyyy') , 'AC_MGR' , 12008 , NULL , 101 , 110 ); INSERT INTO employees VALUES ( 206 , 'William' , 'Gietz' , 'WGIETZ' , '515.123.8181' , TO_DATE('07-06-2002', 'dd-MM-yyyy') , 'AC_ACCOUNT' , 8300 , NULL , 205 , 110 ); REM ********* insert data into the JOB_HISTORY table Prompt ****** Populating JOB_HISTORY table .... INSERT INTO job_history VALUES (102 , TO_DATE('13-01-2001', 'dd-MM-yyyy') , TO_DATE('24-07-2006', 'dd-MM-yyyy') , 'IT_PROG' , 60); INSERT INTO job_history VALUES (101 , TO_DATE('21-09-1997', 'dd-MM-yyyy') , TO_DATE('27-10-2001', 'dd-MM-yyyy') , 'AC_ACCOUNT' , 110); INSERT INTO job_history VALUES (101 , TO_DATE('28-10-2001', 'dd-MM-yyyy') , TO_DATE('15-03-2005', 'dd-MM-yyyy') , 'AC_MGR' , 110); INSERT INTO job_history VALUES (201 , TO_DATE('17-02-2004', 'dd-MM-yyyy') , TO_DATE('19-12-2007', 'dd-MM-yyyy') , 'MK_REP' , 20); INSERT INTO job_history VALUES (114 , TO_DATE('24-03-2006', 'dd-MM-yyyy') , TO_DATE('31-12-2007', 'dd-MM-yyyy') , 'ST_CLERK' , 50 ); INSERT INTO job_history VALUES (122 , TO_DATE('01-01-2007', 'dd-MM-yyyy') , TO_DATE('31-12-2007', 'dd-MM-yyyy') , 'ST_CLERK' , 50 ); INSERT INTO job_history VALUES (200 , TO_DATE('17-09-1995', 'dd-MM-yyyy') , TO_DATE('17-06-2001', 'dd-MM-yyyy') , 'AD_ASST' , 90 ); INSERT INTO job_history VALUES (176 , TO_DATE('24-03-2006', 'dd-MM-yyyy') , TO_DATE('31-12-2006', 'dd-MM-yyyy') , 'SA_REP' , 80 ); INSERT INTO job_history VALUES (176 , TO_DATE('01-01-2007', 'dd-MM-yyyy') , TO_DATE('31-12-2007', 'dd-MM-yyyy') , 'SA_MAN' , 80 ); INSERT INTO job_history VALUES (200 , TO_DATE('01-07-2002', 'dd-MM-yyyy') , TO_DATE('31-12-2006', 'dd-MM-yyyy') , 'AC_ACCOUNT' , 90 ); REM enable integrity constraint to DEPARTMENTS ALTER TABLE departments ENABLE CONSTRAINT dept_mgr_fk; COMMIT; 4. 
Create index
Run the below script to create indexes on the HR schema tables:

SET FEEDBACK 1
SET NUMWIDTH 10
SET LINESIZE 80
SET TRIMSPOOL ON
SET TAB OFF
SET PAGESIZE 100
SET ECHO OFF

CREATE INDEX emp_department_ix ON employees (department_id);
CREATE INDEX emp_job_ix ON employees (job_id);
CREATE INDEX emp_manager_ix ON employees (manager_id);
CREATE INDEX emp_name_ix ON employees (last_name, first_name);
CREATE INDEX dept_location_ix ON departments (location_id);
CREATE INDEX jhist_job_ix ON job_history (job_id);
CREATE INDEX jhist_employee_ix ON job_history (employee_id);
CREATE INDEX jhist_department_ix ON job_history (department_id);
CREATE INDEX loc_city_ix ON locations (city);
CREATE INDEX loc_state_province_ix ON locations (state_province);
CREATE INDEX loc_country_ix ON locations (country_id);
COMMIT;

5. Create procedural objects
Run the below script to create the HR schema procedures and triggers:

SET FEEDBACK 1
SET NUMWIDTH 10
SET LINESIZE 80
SET TRIMSPOOL ON
SET TAB OFF
SET PAGESIZE 100
SET ECHO OFF

REM **************************************************************************
REM procedure and statement trigger to allow DMLs during business hours:

CREATE OR REPLACE PROCEDURE secure_dml
IS
BEGIN
  IF TO_CHAR (SYSDATE, 'HH24:MI') NOT BETWEEN '08:00' AND '18:00'
        OR TO_CHAR (SYSDATE, 'DY') IN ('SAT', 'SUN') THEN
    RAISE_APPLICATION_ERROR (-20205,
        'You may only make changes during normal office hours');
  END IF;
END secure_dml;
/

CREATE OR REPLACE TRIGGER secure_employees
  BEFORE INSERT OR UPDATE OR DELETE ON employees
BEGIN
  secure_dml;
END secure_employees;
/

ALTER TRIGGER secure_employees DISABLE;

REM **************************************************************************
REM procedure to add a row to the JOB_HISTORY table and row trigger
REM to call the procedure when data is updated in the job_id or
REM department_id columns in the EMPLOYEES table:

CREATE OR REPLACE PROCEDURE add_job_history
  (  p_emp_id          job_history.employee_id%type
   , p_start_date      job_history.start_date%type
   , p_end_date        job_history.end_date%type
   , p_job_id          job_history.job_id%type
   , p_department_id   job_history.department_id%type
   )
IS
BEGIN
  INSERT INTO job_history (employee_id, start_date, end_date, job_id, department_id)
    VALUES (p_emp_id, p_start_date, p_end_date, p_job_id, p_department_id);
END add_job_history;
/

CREATE OR REPLACE TRIGGER update_job_history
  AFTER UPDATE OF job_id, department_id ON employees
  FOR EACH ROW
BEGIN
  add_job_history(:old.employee_id, :old.hire_date, sysdate,
                  :old.job_id, :old.department_id);
END;
/
COMMIT;

6. Add comments to tables & columns
Run the below script to add comments to the HR schema tables and columns:

SET FEEDBACK 1
SET NUMWIDTH 10
SET LINESIZE 80
SET TRIMSPOOL ON
SET TAB OFF
SET PAGESIZE 100
SET ECHO OFF

COMMENT ON TABLE regions IS 'Regions table that contains region numbers and names. Contains 4 rows; references with the Countries table.';
COMMENT ON COLUMN regions.region_id IS 'Primary key of regions table.';
COMMENT ON COLUMN regions.region_name IS 'Names of regions. Locations are in the countries of these regions.';

COMMENT ON TABLE locations IS 'Locations table that contains specific address of a specific office, warehouse, and/or production site of a company. Does not store addresses / locations of customers. Contains 23 rows; references with the departments and countries tables.';
COMMENT ON COLUMN locations.location_id IS 'Primary key of locations table';
COMMENT ON COLUMN locations.street_address IS 'Street address of an office, warehouse, or production site of a company. Contains building number and street name';
COMMENT ON COLUMN locations.postal_code IS 'Postal code of the location of an office, warehouse, or production site of a company.';
COMMENT ON COLUMN locations.city IS 'A not null column that shows city where an office, warehouse, or production site of a company is located.';
COMMENT ON COLUMN locations.state_province IS 'State or Province where an office, warehouse, or production site of a company is located.';
COMMENT ON COLUMN locations.country_id IS 'Country where an office, warehouse, or production site of a company is located. Foreign key to country_id column of the countries table.';

REM *********************************************

COMMENT ON TABLE departments IS 'Departments table that shows details of departments where employees work. Contains 27 rows; references with locations, employees, and job_history tables.';
COMMENT ON COLUMN departments.department_id IS 'Primary key column of departments table.';
COMMENT ON COLUMN departments.department_name IS 'A not null column that shows name of a department. Administration, Marketing, Purchasing, Human Resources, Shipping, IT, Executive, Public Relations, Sales, Finance, and Accounting.';
COMMENT ON COLUMN departments.manager_id IS 'Manager_id of a department. Foreign key to employee_id column of employees table. The manager_id column of the employee table references this column.';
COMMENT ON COLUMN departments.location_id IS 'Location id where a department is located. Foreign key to location_id column of locations table.';

REM *********************************************

COMMENT ON TABLE job_history IS 'Table that stores job history of the employees. If an employee changes departments within the job or changes jobs within the department, new rows get inserted into this table with old job information of the employee. Contains a complex primary key: employee_id+start_date. Contains 25 rows. References with jobs, employees, and departments tables.';
COMMENT ON COLUMN job_history.employee_id IS 'A not null column in the complex primary key employee_id+start_date. Foreign key to employee_id column of the employee table';
COMMENT ON COLUMN job_history.start_date IS 'A not null column in the complex primary key employee_id+start_date. Must be less than the end_date of the job_history table. (enforced by constraint jhist_date_interval)';
COMMENT ON COLUMN job_history.end_date IS 'Last day of the employee in this job role. A not null column. Must be greater than the start_date of the job_history table. (enforced by constraint jhist_date_interval)';
COMMENT ON COLUMN job_history.job_id IS 'Job role in which the employee worked in the past; foreign key to job_id column in the jobs table. A not null column.';
COMMENT ON COLUMN job_history.department_id IS 'Department id in which the employee worked in the past; foreign key to department_id column in the departments table';

REM *********************************************

COMMENT ON TABLE countries IS 'country table. Contains 25 rows. References with locations table.';
COMMENT ON COLUMN countries.country_id IS 'Primary key of countries table.';
COMMENT ON COLUMN countries.country_name IS 'Country name';
COMMENT ON COLUMN countries.region_id IS 'Region ID for the country. Foreign key to region_id column in the regions table.';

REM *********************************************

COMMENT ON TABLE jobs IS 'jobs table with job titles and salary ranges. Contains 19 rows. References with employees and job_history table.';
COMMENT ON COLUMN jobs.job_id IS 'Primary key of jobs table.';
COMMENT ON COLUMN jobs.job_title IS 'A not null column that shows job title, e.g. AD_VP, FI_ACCOUNTANT';
COMMENT ON COLUMN jobs.min_salary IS 'Minimum salary for a job title.';
COMMENT ON COLUMN jobs.max_salary IS 'Maximum salary for a job title';

REM *********************************************

COMMENT ON TABLE employees IS 'employees table. Contains 107 rows. References with departments, jobs, job_history tables. Contains a self reference.';
COMMENT ON COLUMN employees.employee_id IS 'Primary key of employees table.';
COMMENT ON COLUMN employees.first_name IS 'First name of the employee. A not null column.';
COMMENT ON COLUMN employees.last_name IS 'Last name of the employee. A not null column.';
COMMENT ON COLUMN employees.email IS 'Email id of the employee';
COMMENT ON COLUMN employees.phone_number IS 'Phone number of the employee; includes country code and area code';
COMMENT ON COLUMN employees.hire_date IS 'Date when the employee started on this job. A not null column.';
COMMENT ON COLUMN employees.job_id IS 'Current job of the employee; foreign key to job_id column of the jobs table. A not null column.';
COMMENT ON COLUMN employees.salary IS 'Monthly salary of the employee. Must be greater than zero (enforced by constraint emp_salary_min)';
COMMENT ON COLUMN employees.commission_pct IS 'Commission percentage of the employee; only employees in the sales department are eligible for a commission percentage';
COMMENT ON COLUMN employees.manager_id IS 'Manager id of the employee; has same domain as manager_id in departments table. Foreign key to employee_id column of employees table. (useful for reflexive joins and CONNECT BY query)';
COMMENT ON COLUMN employees.department_id IS 'Department id where employee works; foreign key to department_id column of the departments table';
COMMIT;

7. Gather schema stats
Run the below script as the SYS user to gather optimizer statistics on the HR schema:

conn sys as sysdba

SET FEEDBACK 1
SET NUMWIDTH 10
SET LINESIZE 80
SET TRIMSPOOL ON
SET TAB OFF
SET PAGESIZE 100
SET ECHO OFF

EXECUTE dbms_stats.gather_schema_stats( -
        'HR'                            , -
        granularity => 'ALL'            , -
        cascade => TRUE                 , -
        block_sample => TRUE            );
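Once all seven steps are complete, you can optionally sanity-check the setup. The below queries are not part of the original HR scripts; they are only a suggested verification step. Run them while connected as the HR user (or use the DBA_ views with an OWNER = 'HR' filter) to confirm that the objects exist, that the tables hold rows, and that statistics were gathered.

-- Count the objects created in the HR schema, by type
SELECT object_type, COUNT(*)
FROM   user_objects
GROUP  BY object_type
ORDER  BY object_type;

-- Confirm row counts and the statistics collection timestamp for each table
SELECT table_name, num_rows, last_analyzed
FROM   user_tables
ORDER  BY table_name;

-- Confirm the table comments added in step 6 are in place
SELECT table_name, comments
FROM   user_tab_comments
WHERE  comments IS NOT NULL;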

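If you also want to see the update_job_history trigger from step 5 in action, the below test is one way to do it. This is only an illustration and assumes the sample data was loaded exactly as shown above (employee 190 is one of the shipping clerks); run it in a test environment and roll it back so the sample data stays unchanged.

-- Move employee 190 (Timothy Gates) to a new job; the row trigger should
-- archive his previous assignment into JOB_HISTORY automatically
UPDATE employees
SET    job_id = 'ST_MAN', department_id = 50
WHERE  employee_id = 190;

-- The old job_id and department_id should now appear as a history row
SELECT employee_id, start_date, end_date, job_id, department_id
FROM   job_history
WHERE  employee_id = 190;

-- Undo both the test update and the trigger's insert
ROLLBACK;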
  • Find Session Id Running Specific Query

    Find Session Id Running Specific Query
    At times DBAs need to find or search for the details of a session that is running a specific query inside the database. For example, you might want to find the session ID that is running an ALTER TABLE command.
    Note: This query returns details only while the query is still running inside the database.

    SET LINES 300
    SET PAGES 999
    COL SID FOR 99999
    COL SER# FOR 9999999
    COL OS_ID FOR A5
    COL STATUS FOR A8
    COL SQL_FULLTEXT FOR A60

    SELECT SES.SID, SES.SERIAL# SER#, SES.PROCESS OS_ID, SES.STATUS, SQL.SQL_FULLTEXT
    FROM V$SESSION SES, V$SQL SQL, V$PROCESS PRC
    WHERE SES.SQL_ID=SQL.SQL_ID
    AND SES.SQL_HASH_VALUE=SQL.HASH_VALUE
    AND SES.PADDR=PRC.ADDR
    AND UPPER(SQL.SQL_FULLTEXT) LIKE UPPER('ALTER TABLE%SHRINK%');

    You can change the last line to search for sessions that are running other queries. Replace ALTER TABLE%SHRINK% with whatever command pattern you want to search for.
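    Once you have the SID and SERIAL# from the output above, a common follow-up is to either keep monitoring the session or terminate it. The below statements are a generic illustration only; the values 1234 and 56789 are placeholders that you must replace with the SID and SERIAL# returned by the query, and killing a session is disruptive, so use it with care.

    -- Keep an eye on the session (replace 1234 with the SID from the output above)
    SELECT sid, serial#, status, event, last_call_et
    FROM   v$session
    WHERE  sid = 1234;

    -- Or terminate it (replace 1234 and 56789 with the SID and SERIAL# from the output above)
    ALTER SYSTEM KILL SESSION '1234,56789' IMMEDIATE;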
