Feed aggregator

Connecting to MySQL Database Service (MDS) via DBeaver

DBASolved - Mon, 2022-05-16 10:37

With every new service on any cloud platform, the need to make connections is essential. This is the case with […]

The post Connecting to MySQL Database Service (MDS) via DBeaver appeared first on DBASolved.

Categories: DBA Blogs

A quick way of generating Informatica PowerCenter Mappings from a template

Rittman Mead Consulting - Mon, 2022-05-16 04:52
Generating Informatica PowerCenter Content - the Options

In our blogs we have discussed the options for Oracle Data Integrator (ODI) content generation here and here. Our go-to method is to use the ODI Java SDK, which allows querying, manipulating and generating new ODI content.

Can we do the same with Informatica PowerCenter? In the older PC versions there was the Design API that enabled browsing the repository and creating new content. However, I have never used it. My impression is that Oracle APIs are more accessible than Informatica APIs in terms of documentation, help available online and availability for download and tryout.
If we want to browse the PowerCenter repository content, there is an easy way - query the repository database. But what about content generation? Who will be brave or foolish enough to insert records directly into a repository database!? Fortunately, there is a way, and a fairly easy one, if you don't mind doing a bit of Python scripting.

Generate PowerCenter Mappings - an Overview

Selective Informatica PC repository migrations are done via XML export and import - it is easy and mostly fool-proof. If we can generate XMLs for import, then we have found a way of auto-generating PowerCenter content. Informatica seems to support this approach by giving us nice, descriptive error messages if something is wrong with import XMLs. Only completely valid XMLs will import successfully. I have never managed to corrupt my Informatica repository with a dodgy XML import.

Let us look at an example - we need to extract a large number of OLTP tables to a Staging schema. The source and staging tables have very similar structures, except the staging tables have MD5 codes based on all non-key source fields to simplify change data capture (CDC) and also have the extract datetime.

  1. We start by creating a single mapping in Designer, test it, make sure we are 100% happy with it before proceeding further;
  2. We export the mapping in XML format and in the XML file we replace anything unique to the source and target tables and their fields with placeholder tags: [[EXAMPLE_TAG]]. (See the XML template example further down.)
  3. Before we generate XMLs for all needed mappings, we need to import Source and Target table definitions from the databases. (We could, if we wanted, generate Source and Target XMLs ourselves, but PC Designer allows us to import tables in bulk, which is quicker and easier than generating the XMLs.)
  4. We export all Sources into a single XML file, e.g. sources.xml. Same with all the Targets - they go into targets.xml. (You can select multiple objects and export in a single XML in Repository Manager.) The Source XML file will serve as a driver for our Mapping generation - all Source tables in the sources.xml file will have a Mapping generated for them.
  5. We run a script that iterates through all source tables in the source XML, looks up its target in the targets XML and generates a mapping XML. (See the Python script example further down.) Note that both the Source and Target XML become part of the Mapping XML.
  6. We import the mapping XMLs. If we import manually via the Designer, we still save time in comparison to implementing the mappings in Designer one by one. But we can script the imports, thus getting both the generation and import done in minutes, by creating an XML Control File as described here.
Scripting Informatica PowerCenter Mapping generation

A further improvement to the above would be reusable Session generation. We can generate Sessions in the very same manner as we generate Mappings.

The Implementation

An example XML template for a simple Source-to-Staging mapping that includes Source, Source Qualifier, Expression and Target:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE POWERMART SYSTEM "powrmart.dtd">
<POWERMART CREATION_DATE="05/26/2021 11:55:12" REPOSITORY_VERSION="188.97">
<REPOSITORY NAME="DemoETL" VERSION="188" CODEPAGE="UTF-8" DATABASETYPE="Oracle">
<FOLDER NAME="Extract" GROUP="" OWNER="Developer" SHARED="NOTSHARED" DESCRIPTION="" PERMISSIONS="rwx---r--" UUID="55321111-2222-4929-9fdc-bd0dfw245cd3">

    [[SOURCE]]
	
    [[TARGET]]
	
    <MAPPING DESCRIPTION ="[[MAPPING_DESCRIPTION]]" ISVALID ="YES" NAME ="[[MAPPING_NAME]]" OBJECTVERSION ="1" VERSIONNUMBER ="2">
	
        <TRANSFORMATION DESCRIPTION ="" NAME ="SQ_EXTRACT" OBJECTVERSION ="1" REUSABLE ="NO" TYPE ="Source Qualifier" VERSIONNUMBER ="1">
            [[SQ_TRANSFORMFIELDS]]
            <TABLEATTRIBUTE NAME ="Sql Query" VALUE =""/>
            <TABLEATTRIBUTE NAME ="User Defined Join" VALUE =""/>
            <TABLEATTRIBUTE NAME ="Source Filter" VALUE =""/>
            <TABLEATTRIBUTE NAME ="Number Of Sorted Ports" VALUE ="0"/>
            <TABLEATTRIBUTE NAME ="Tracing Level" VALUE ="Normal"/>
            <TABLEATTRIBUTE NAME ="Select Distinct" VALUE ="NO"/>
            <TABLEATTRIBUTE NAME ="Is Partitionable" VALUE ="NO"/>
            <TABLEATTRIBUTE NAME ="Pre SQL" VALUE =""/>
            <TABLEATTRIBUTE NAME ="Post SQL" VALUE =""/>
            <TABLEATTRIBUTE NAME ="Output is deterministic" VALUE ="NO"/>
            <TABLEATTRIBUTE NAME ="Output is repeatable" VALUE ="Never"/>
        </TRANSFORMATION>
		
        <TRANSFORMATION DESCRIPTION ="" NAME ="EXPTRANS" OBJECTVERSION ="1" REUSABLE ="NO" TYPE ="Expression" VERSIONNUMBER ="2">
            [[EXP_TRANSFORMFIELDS]]
            <TRANSFORMFIELD DATATYPE ="nstring" DEFAULTVALUE ="ERROR(&apos;transformation error&apos;)" DESCRIPTION ="" EXPRESSION ="[[MD5_EXPRESSION]]" EXPRESSIONTYPE ="GENERAL" NAME ="CDC_MD5" PICTURETEXT ="" PORTTYPE ="OUTPUT" PRECISION ="32" SCALE ="0"/>
            <TRANSFORMFIELD DATATYPE ="date/time" DEFAULTVALUE ="ERROR(&apos;transformation error&apos;)" DESCRIPTION ="" EXPRESSION ="SYSTIMESTAMP()" EXPRESSIONTYPE ="GENERAL" NAME ="EXTRACT_DATE" PICTURETEXT ="" PORTTYPE ="OUTPUT" PRECISION ="29" SCALE ="9"/>
            <TABLEATTRIBUTE NAME ="Tracing Level" VALUE ="Normal"/>
        </TRANSFORMATION>

        [[SOURCE_INSTANCE]]
		
        <INSTANCE DESCRIPTION ="" NAME ="SQ_EXTRACT" REUSABLE ="NO" TRANSFORMATION_NAME ="SQ_EXTRACT" TRANSFORMATION_TYPE ="Source Qualifier" TYPE ="TRANSFORMATION">
            <ASSOCIATED_SOURCE_INSTANCE NAME ="[[SOURCE_INSTANCE_NAME]]"/>
        </INSTANCE>
		
        <INSTANCE DESCRIPTION ="" NAME ="EXPTRANS" REUSABLE ="NO" TRANSFORMATION_NAME ="EXPTRANS" TRANSFORMATION_TYPE ="Expression" TYPE ="TRANSFORMATION"/>
		
        [[TARGET_INSTANCE]]

        [[SRC_2_SQ_CONNECTORS]]

        [[SQ_2_EXP_CONNECTORS]]

        [[EXP_2_TGT_CONNECTORS]]

        <CONNECTOR FROMFIELD ="CDC_MD5" FROMINSTANCE ="EXPTRANS" FROMINSTANCETYPE ="Expression" TOFIELD ="CDC_MD5" TOINSTANCE ="[[TARGET_INSTANCE_NAME]]" TOINSTANCETYPE ="Target Definition"/>
        <CONNECTOR FROMFIELD ="EXTRACT_DATE" FROMINSTANCE ="EXPTRANS" FROMINSTANCETYPE ="Expression" TOFIELD ="EXTRACT_DATE" TOINSTANCE ="[[TARGET_INSTANCE_NAME]]" TOINSTANCETYPE ="Target Definition"/>

        <TARGETLOADORDER ORDER ="1" TARGETINSTANCE ="[[TARGET_INSTANCE_NAME]]"/>

        <ERPINFO/>
        <METADATAEXTENSION COMPONENTVERSION ="1000000" DATATYPE ="STRING" DESCRIPTION ="" DOMAINNAME ="User Defined Metadata Domain" ISCLIENTEDITABLE ="YES" ISCLIENTVISIBLE ="YES" ISREUSABLE ="YES" ISSHAREREAD ="NO" ISSHAREWRITE ="NO" MAXLENGTH ="256" NAME ="Extension" VALUE ="" VENDORNAME ="INFORMATICA"/>
    </MAPPING>
</FOLDER>
</REPOSITORY>
</POWERMART>

Python script snippets for generating Mapping XMLs based on the above template:

1. To translate database types to Informatica data types:
mapDataTypeDict = {
	"nvarchar": "nstring",
	"date": "date/time",
	"timestamp": "date/time",
	"number": "decimal",
	"bit": "nstring"
}
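In practice the lookup needs a fallback for any database type the dictionary does not cover. A minimal sketch, assuming (as a hypothetical convention) that unknown types default to "nstring" and that incoming type names may arrive in any case:

```python
# Database type -> Informatica type, as in the dictionary above.
mapDataTypeDict = {
    "nvarchar": "nstring",
    "date": "date/time",
    "timestamp": "date/time",
    "number": "decimal",
    "bit": "nstring"
}

def toInformaticaType(dbType):
    # Normalise case before the lookup so "NUMBER" and "number" both match;
    # fall back to "nstring" for anything unmapped (an assumed default).
    return mapDataTypeDict.get(dbType.lower(), "nstring")
```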

2. Set up a dictionary of tags:

xmlReplacer = {
	"[[SOURCE]]": "",
	"[[TARGET]]": "",
	"[[MAPPING_DESCRIPTION]]": "",
	"[[MAPPING_NAME]]": "",
	"[[SQ_TRANSFORMFIELDS]]": "",
	"[[EXP_TRANSFORMFIELDS]]": "",
	"[[MD5_EXPRESSION]]": "",
	"[[SOURCE_INSTANCE]]": "",
	"[[SOURCE_INSTANCE_NAME]]": "",
	"[[TARGET_INSTANCE]]": "",
	"[[TARGET_INSTANCE_NAME]]": "",
	"[[SRC_2_SQ_CONNECTORS]]": "",
	"[[SQ_2_EXP_CONNECTORS]]": "",
	"[[EXP_2_TGT_CONNECTORS]]": ""
}

3. We use the Source tables we extracted in a single XML file as our driver for Mapping creation:

import xml.etree.ElementTree as ET

sourceXmlFilePath = '.\\sources.xml'

# go down the XML tree to individual Sources
sourceTree = ET.parse(sourceXmlFilePath)
sourcePowerMart = sourceTree.getroot()
sourceRepository = list(sourcePowerMart)[0]
sourceFolder = list(sourceRepository)[0]

for xmlSource in sourceFolder:
    # generate a Mapping for each Source

    # we also need to go down to the Field level
    for sourceField in xmlSource:
        pass  # field-level operations go here
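To connect this loop with step 5 of the overview, here is a minimal sketch of the target lookup, assuming the exports follow the POWERMART/REPOSITORY/FOLDER structure of the template and that a staging table carries the same NAME attribute as its source (both assumptions, not details taken from the original script):

```python
import xml.etree.ElementTree as ET

def folderOf(root):
    # Drill down POWERMART -> REPOSITORY -> FOLDER, matching the
    # structure of the Repository Manager exports shown above.
    return list(list(root)[0])[0]

def indexTablesByName(folderElement):
    # Build a NAME -> element lookup for every table in the folder.
    return {table.get("NAME"): table for table in folderElement}

# Tiny inline stand-ins so the sketch is self-contained; real code would
# call ET.parse('.\\sources.xml') and ET.parse('.\\targets.xml') instead.
sourcesRoot = ET.fromstring(
    '<POWERMART><REPOSITORY><FOLDER>'
    '<SOURCE NAME="CUSTOMERS"/><SOURCE NAME="ORDERS"/>'
    '</FOLDER></REPOSITORY></POWERMART>')
targetsRoot = ET.fromstring(
    '<POWERMART><REPOSITORY><FOLDER>'
    '<TARGET NAME="CUSTOMERS"/>'
    '</FOLDER></REPOSITORY></POWERMART>')

targets = indexTablesByName(folderOf(targetsRoot))
matched = []
for xmlSource in folderOf(sourcesRoot):
    # Assumption: the staging target shares the NAME of its source.
    xmlTarget = targets.get(xmlSource.get("NAME"))
    if xmlTarget is None:
        continue  # no target definition found for this source - skip it
    # Both XML fragments later become part of the generated Mapping XML.
    matched.append((xmlSource.get("NAME"),
                    ET.tostring(xmlTarget, encoding="unicode")))
```

Sources without a matching target are simply skipped here; a production script would more likely log them for investigation.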

4. Generate tag values. This particular example is a column-level tag: a connector between the Source Qualifier and the Expression:

sqToExpConnectorTag = f'<CONNECTOR FROMFIELD ="{columnName}" FROMINSTANCE ="SQ_EXTRACT" FROMINSTANCETYPE ="Source Qualifier" TOFIELD ="{columnName}" TOINSTANCE ="EXPTRANS" TOINSTANCETYPE ="Expression"/>'
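Extending the single connector above to all columns, a sketch that builds the full connector list from a hypothetical column list, together with one plausible way of deriving the [[MD5_EXPRESSION]] tag value (the MD5 port formulation is an assumption, not taken from the original mapping):

```python
# Hypothetical non-key source columns driving the column-level tags.
columnNames = ["CUSTOMER_NAME", "CITY", "BALANCE"]

sqToExpConnectors = []
for columnName in columnNames:
    # One CONNECTOR line per column, as in the single-tag example above.
    sqToExpConnectors.append(
        f'<CONNECTOR FROMFIELD ="{columnName}" FROMINSTANCE ="SQ_EXTRACT" '
        f'FROMINSTANCETYPE ="Source Qualifier" TOFIELD ="{columnName}" '
        f'TOINSTANCE ="EXPTRANS" TOINSTANCETYPE ="Expression"/>')

# One way to build the CDC_MD5 port expression: concatenate all non-key
# fields with a separator and hash the result (an assumed formulation).
md5Expression = "MD5(" + " || '|' || ".join(columnNames) + ")"
```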

5. We assign our tag values to the tag dictionary entries:

xmlReplacer["[[SQ_2_EXP_CONNECTORS]]"] = '\n'.join(sqToExpConnectors)

6. We replace the tags in the XML Template with the values from the dictionary:

for replaceTag in xmlReplacer.keys():
	mappingXml = mappingXml.replace(replaceTag, xmlReplacer[replaceTag])
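A minimal end-to-end sketch of the substitution and output steps, using a shortened inline template and hypothetical tag values (real code would read the full XML template from a file and fill every tag in the dictionary):

```python
# Hypothetical tag values for one generated mapping.
xmlReplacer = {
    "[[MAPPING_NAME]]": "m_EXTRACT_CUSTOMERS",
    "[[MAPPING_DESCRIPTION]]": "Extract CUSTOMERS to staging"
}

# Shortened inline stand-in; real code would read the template file:
# with open("mapping_template.xml") as f: mappingTemplate = f.read()
mappingTemplate = ('<MAPPING DESCRIPTION ="[[MAPPING_DESCRIPTION]]" '
                   'NAME ="[[MAPPING_NAME]]"/>')

# Replace every tag in the template with its generated value.
mappingXml = mappingTemplate
for replaceTag in xmlReplacer.keys():
    mappingXml = mappingXml.replace(replaceTag, xmlReplacer[replaceTag])

# Write one importable XML per mapping (file name is hypothetical).
with open("m_EXTRACT_CUSTOMERS.xml", "w", encoding="utf-8") as outFile:
    outFile.write(mappingXml)
```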

Interested in finding out more about our approach to generating Informatica content? Contact us.

Categories: BI & Warehousing

Data Annotation with SVG and JavaScript

Andrejus Baranovski - Mon, 2022-05-16 01:35
I explain how to build a simple data annotation tool with SVG and JavaScript in an HTML page. The sample code renders two boxes in SVG on top of a receipt image. You will learn how to select and switch between annotation boxes. Enjoy!

Maximum number of concurrent sessions in multi instance database

Tom Kyte - Sun, 2022-05-15 23:46
Hi, we have Oracle 12c with 2 instances. I know GV$LICENSE can give the maximum number of concurrent sessions since the start of each instance, but is there a way to get the maximum number of sessions accessing the database from both instances together? Syed
Categories: DBA Blogs

Index on XMLTYPE with XPATH Expression including a XPATH Function

Tom Kyte - Sun, 2022-05-15 23:46
Is there a way to create an index for an XPath expression that includes an XPath function? Please note that the XMLType index creation fails on Oracle LiveSQL.
Categories: DBA Blogs

Cannot Upload git-upload-pack error while cloning Azure Git Repository

Tom Kyte - Sun, 2022-05-15 23:46
Hi,

Background and requirement: I am working for a firm that uses Oracle SQL Developer for data cleaning and manipulation of the data residing in the Oracle Database. We use Microsoft Azure for complete lifecycle management and work planning, so we decided to use an Azure-hosted Git repository to host our code remotely and leverage its version control capabilities. We have a Git repository on Azure and are trying to clone it in Oracle SQL Developer.

Steps followed to fulfil the requirement:
1. Go to the Team menu.
2. Hover over Git.
3. Select the Clone option.
4. After the Clone from Git wizard opens, enter the correct repository URL, username and password.
5. We work on a VPN, so I have set the corresponding proxy settings too. When testing the proxy, it gives a success message, so there is no issue with the proxy settings.
6. Click Next to fetch the remote repository branches. The error appears at this stage.

Error that occurred: a popup appears with the title "Validation failed" and the content https://<remote repo url>/_git/<remote repo name>: cannot open git-upload-pack.

Troubleshooting methods tried:
1. Many troubleshooting guides online suggested that setting sslVerify to false in the local Git config could help. Did that, no gain.
2. Tried cloning my personal Git repository to test the Git integration in Oracle SQL Developer. It was able to fetch the remote branches successfully, so the error comes up only when cloning an Azure repository.
3. Looked at almost all the solution links online, but most of them were for Eclipse. Since both Eclipse and SQL Developer are Java-based applications, I tried those resolutions too, but most of them also came down to setting SSL verify to false.

In the end I have raised the issue here. Hoping to find some help. Thanks in advance.
Categories: DBA Blogs

Getting a Job Through a Music Vocational School's Internship Programme

Marian Crkon - Sun, 2022-05-15 00:14
People's motivations for attending a music vocational school vary greatly from person to person. While some hope to become professionals...

Getting a Job Through a Music Vocational School's Internship Programme

The Feature - Sun, 2022-05-15 00:14

People's motivations for attending a music vocational school vary greatly from person to person. While some hope to become professionals, others want to enter the industry and work in some capacity. Among the many possible paths, choosing a music vocational school offers advantages that self-study does not. If you want to work in music, in whatever form, possessing a wide range of skills and knowledge is essential.

Because the field is highly specialised and depends heavily on artistic sensibility, it is difficult to acquire the knowledge from the internet or from books, and nothing beats learning directly from instructors. Singing and playing instruments in particular are shaped by sensibility, so it is impossible to reach a professional level through classroom study alone. Moreover, many instructors are professional musicians who spend their ordinary working days on recording sessions in studios. Because they wear two hats, there is a real possibility that an instructor can provide your first step into the industry.

Furthermore, music vocational schools, partly to raise their employment rates and to send capable people into the industry, provide many points of contact between the industry and their students. Some schools also run programmes in which students, mainly those seeking employment, work for a fixed period at partner companies. By working in the ever-changing music industry, students learn things that cannot be learned at school. It is not unusual for graduates to join the very company where they did their internship.

Categories: APPS Blogs

FORCE_LOGGING in Autonomous Database

Tom Kyte - Fri, 2022-05-13 16:46
Is FORCE_LOGGING enabled at the CDB level in ADB-S? I checked that FORCE_LOGGING was not enabled at the PDB level or the tablespace level.
Categories: DBA Blogs

Find Circular References in UDTs

Tom Kyte - Fri, 2022-05-13 16:46
The latest Oracle docs have the following design tip:

9.13.5.2 Circular Dependencies Among Types
Avoid creating circular dependencies among types. In other words, do not create situations in which a method of type T returns a type T1, which has a method that returns a type T.
https://docs.oracle.com/en/database/oracle/oracle-database/21/adobj/design-consideration-tips-and-techniques.html

Attached is a link to LiveSQL that exhibits a very simple circular dependency that will likely have issues recompiling during a Data Pump import. Assuming we already have a large application that the compiler is having issues with, is there a query we can use to find instances where T1 references T2 and T2 references T1? We would also need to find them a few generations apart (T1 references T2, T2 references T3, T3 references T1). The reference may be either in an attribute (REF) or a subprogram (parameter or return type). This would allow us to find which types may need to be changed to be brought in line with the latest documentation. Thanks in advance for your help.
Categories: DBA Blogs

Select XMLQuery XML parsing error with ampersands

Tom Kyte - Fri, 2022-05-13 16:46
Hi Tom and Team, I guess that this issue is related to the namespace, but as I don't know this area well, could you help me to solve the error raised by running this SELECT, please?

with testTable as (
  select xmltype('<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
    <soap:Body>
      <ns5:MT_Consulta_pedidos_pagamento xmlns:ns2="urn:Cpy.com/Model/ConsultaPedidosDevolucao/v0" xmlns:ns3="urn:Cpy.com/Model/AtualizaStatusPagamento/v0" xmlns:ns4="urn:Cpy.com/Model/AtualizaItensDevolvidosCancelados/v0" xmlns:ns5="urn:Cpy.com/Model/ConsultaPedidosPagamento/v0">
        <codigo_empresa>&Empresa</codigo_empresa>
        <numero_pedido_venda>&Pedido</numero_pedido_venda>
        <codigo_loja>&Loja</codigo_loja>
        <numero_componente>&Componente</numero_componente>
      </ns5:MT_Consulta_pedidos_pagamento>
    </soap:Body>
  </soap:Envelope>') xml_val
  from dual
)
select xmlquery('/soap' passing xml_val returning content) as dados
from testTable;
Categories: DBA Blogs

Patch Oracle GoldenGate Microservices using RESTful APIs

DBASolved - Fri, 2022-05-13 08:10

In 2017, Oracle introduced the world to Oracle GoldenGate Microservices through the release of Oracle GoldenGate 12c (12.3.0.0.1). Upon the […]

The post Patch Oracle GoldenGate Microservices using RESTful APIs appeared first on DBASolved.

Categories: DBA Blogs

How can we execute a SQL script file from a trigger and capture the output of the execution in a log file?

Tom Kyte - Thu, 2022-05-12 22:26
How can we execute a SQL script file from a trigger, and capture the output of that execution in a log file? We are automating the execution of one of our SQL script files. We want the script to run once data is inserted into a table, so we want to execute it from a trigger. Regards, Abhishek Bhargava
Categories: DBA Blogs

PLSQL nested procedure hides resolution of an outer procedure

Tom Kyte - Thu, 2022-05-12 22:26
declare
  type t1 is record (f1 number);
  type t2 is record (f1 number);
  v1 t1;
  v2 t2;

  procedure q(p1 in t1) is
  begin
    null;
  end q;

  procedure p(p1 in t1, p2 in t2) is
    procedure q(p2 in t2) is
    begin
      null;
    end q;
  begin
    q(p1);
    q(p2);
  end p;
begin
  p(v1, v2);
end;
/

Procedure p has a nested procedure with the same name as an outer procedure (q). PL/SQL cannot resolve the call to q, raising the error PLS-00306: wrong number or types of arguments in call to 'Q'. If I move the nested procedure to an outer scope, the block runs OK:

declare
  type t1 is record (f1 number);
  type t2 is record (f1 number);
  v1 t1;
  v2 t2;

  procedure q(p1 in t1) is
  begin
    null;
  end q;

  procedure q(p2 in t2) is
  begin
    null;
  end q;

  procedure p(p1 in t1, p2 in t2) is
  begin
    q(p1);
    q(p2);
  end p;
begin
  p(v1, v2);
end;
/

It seems that the local procedure q(t2) hides the outer q(t1), even though they have different signatures. Are there any reasons for that behaviour? Thanks, Eddy
Categories: DBA Blogs

External table in a PL/SQL procedure

Tom Kyte - Thu, 2022-05-12 22:26
Hi Tom, my task: move several dozen text file imports from SQL*Loader (on AIX) into callable PL/SQL procedures. The text files are static in structure with daily refreshes of the contents. The contents are loaded into individual tables in our 19c EE database. The solution appeared to be external tables, so I created a proof-of-concept example that worked as expected as stand-alone code. So far, so good:

SELECT * FROM all_directories WHERE directory_name = 'CONNECT2'; -- returns /connect2.

CREATE TABLE MY_EXT_TBL
(
  CUSIP       VARCHAR2(25 BYTE),
  DESCRIPTION VARCHAR2(200 BYTE),
  QTY         NUMBER(18,5),
  ACCOUNT     VARCHAR2(100 BYTE)
)
ORGANIZATION EXTERNAL
(
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY CONNECT2
  ACCESS PARAMETERS
  (
    RECORDS DELIMITED BY NEWLINE
    BADFILE CONNECT2:'MY_EXT_TBL%a_%p.bad'
    LOGFILE CONNECT2:'MY_EXT_TBL%a_%p.log'
    DISCARDFILE CONNECT2:'MY_EXT_TBL%a_%p.discard'
    FIELDS TERMINATED BY '|'
    MISSING FIELD VALUES ARE NULL
    (CUSIP, DESCRIPTION, QTY, ACCOUNT)
  )
  LOCATION ('exttabletestfile.txt')
)
REJECT LIMIT UNLIMITED;
-- Table MY_EXT_TBL created.

SELECT COUNT(*) FROM MY_EXT_TBL; -- Returns 65159. Matches file row count.

It was when I attempted to move the working code into a procedure that things went sour. This example shows a very basic case (no log, bad, or discard files) and hints at the hazards of going that route. I accepted that challenge, but after trying every combination of single and double quotes around file names without success, I am stumped. This feels harder than it should be. If external tables in a stored procedure are a valid, if tricky, solution, could you please demonstrate a working example? Or should I be using UTL_FILE instead? Or something else? Best regards, Dexter
Categories: DBA Blogs

Configure of Oracle Data Miner repository in SQL Developer Desktop to work with Autonomous Database

Tom Kyte - Thu, 2022-05-12 04:06
I was looking at this article: https://blogs.oracle.com/machinelearning/post/oracle-data-miner-now-available-for-autonomous-database. Is Data Miner also supported on ADW? If so, I am looking for a tutorial on setting up Oracle Data Miner to use with ADW. In particular, I am struggling with the setup of the Data Miner connection / user with SYS privileges to install the Data Miner repository. I am using SQL Developer 21.4.3.063 on macOS.
Categories: DBA Blogs

Why Highly Individual Talent Is Not Suited to Music Vocational Schools

Marian Crkon - Thu, 2022-05-12 00:13
If you love music and often perform yourself, there inevitably comes a time when you wonder whether you should go on to a music vocational school. The answer...

Why Highly Individual Talent Is Not Suited to Music Vocational Schools

The Feature - Thu, 2022-05-12 00:13

If you love music and often perform yourself, there inevitably comes a time when you wonder whether you should go on to a music vocational school. Although it is a difficult question to answer, there is one piece of guidance you can use as a reference: if you are aiming to be among the very best professionals, you should not go to a music vocational school. If, on the other hand, you are not aiming for an extremely high level and would simply be happy with a job that involves music, then you should enrol.

The reason for this criterion is that music is a field in which artistic sensibility plays a large role. Precisely because sensibility matters so much, studying the same material alongside a large group of people tends to make your skills and knowledge settle at an average level. Even if you have a personality unlike anyone else's, that individuality is likely to be lost. In other words, someone with a genuinely distinctive, attractive individuality will grow far more by travelling the world with a bag in hand than by attending a music vocational school.

As a look across the music industry makes clear, the people active at the front line are mostly those with strong individuality, and few of them have a music vocational school on their CV. However, if you currently have no outstanding strengths but want to work in music in the future, then a music vocational school is the way to go.

Categories: APPS Blogs

Table with 900 million records and 2 CLOB fields, weighing 5 TB, with only one index

Tom Kyte - Wed, 2022-05-11 09:46
Greetings Oracle DB gurus,

I would like a recommendation on this subject. The database weighs 7 TB in total, but 5 of those 7 TB is the audit table alone. That table holds only 3 years of data (the business needs to keep all of it), has more than 900 million records and 2 CLOB fields, and is heavily written to. We have had several incidents related to this table: slowness in the database when inserting into it (the CLOB fields sometimes store 10 million characters, which may be related), and we have also run out of disk space, tablespace, or data file space; the log fills up so fast that the alerts do not even arrive before the disk is full. This table is used by several applications at the same time and records all the activities that users perform; the CLOB fields hold the details of the activities. The business wants to pull reports from this table, yet it has only one index.

Here is the structure of the table:

CREATE TABLE EBTDEV.ADMIN_AUDIT
(
  ID                NUMBER NOT NULL,
  EVENT_TYPE        NUMBER(1, 0),
  OWNER_ID          NUMBER,
  OWNER             VARCHAR2(100 BYTE),
  OWNER_PERMISSIONS CLOB,
  EVENT_DESCRIPTION VARCHAR2(200 BYTE),
  OBJECT_TYPE       VARCHAR2(100 BYTE),
  OBJECT_ID         NUMBER,
  BEFORE            CLOB,
  AFTER             CLOB,
  TERMINAL          VARCHAR2(100 BYTE),
  EVENT_DATE        TIMESTAMP(6),
  AGENCY            VARCHAR2(10 BYTE),
  PORTAL            VARCHAR2(20 BYTE),
  UPD_FILE_DW       TIMESTAMP(6)
);

And this is the only index it has:

CREATE INDEX EBTDEV.IX_EVENT_DT_UPD_FILE_DW
  ON EBTDEV.ADMIN_AUDIT (EVENT_DATE ASC, UPD_FILE_DW ASC);

My question is: what do you recommend to improve performance of the reporting and to optimise the table so that we have no more issues with database space and slowness?
Categories: DBA Blogs
