
Top 100+ Etl Testing Interview Questions And Answers - May 30, 2020


Question 1. What Is Etl?

Answer :

ETL stands for extraction, transformation and loading.

ETL tools provide developers with an interface for designing source-to-target mappings, transformations, and job control parameters.

Extraction :

Takes data from an external source and moves it to the warehouse pre-processor database.

Transformation:

The transform data task allows point-to-point generating, modifying and transforming of data.

Loading:

The load data task adds data to a database table in the warehouse.
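The three steps above can be sketched in plain Python. This is a minimal illustration only; the function names and sample records are invented and not tied to any particular ETL tool:

```python
def extract(source):
    """Extraction: pull raw records from the source system."""
    return list(source)

def transform(rows):
    """Transformation: clean and reshape each record."""
    return [
        {"id": r["id"],
         "name": r["name"].strip().title(),   # trim and normalize the name
         "amount": round(r["amount"], 2)}     # normalize the amount
        for r in rows
    ]

def load(rows, warehouse):
    """Loading: add the transformed records to the warehouse table."""
    for r in rows:
        warehouse[r["id"]] = r
    return warehouse

# Stand-in for an external source system.
source = [{"id": 1, "name": "  alice ", "amount": 10.456},
          {"id": 2, "name": "BOB", "amount": 20.0}]

warehouse = load(transform(extract(source)), {})
print(warehouse[1]["name"])   # Alice
```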

Question 2. What Is The Difference Between Etl Tool And Olap Tools?

Answer :

ETL tools are meant for extracting data from legacy systems and loading it into a specific database, with some process of cleansing the data.

Ex: Informatica, DataStage, etc.

OLAP tools are meant for reporting; in OLAP, data is available in a multidimensional model, so you can write simple queries to extract data from the database.

Ex: Business Objects, Cognos, etc.

Question 3. Can We Lookup A Table From Source Qualifier Transformation, i.e. Unconnected Lookup?

Answer :

You cannot lookup from a Source Qualifier directly. However, you can override the SQL in the Source Qualifier to join with the lookup table to perform the lookup.

Question 4. What Is Ods (operation Data Source)?

Answer :

ODS - Operational Data Store.
ODS comes between the staging area and the data warehouse. The data in ODS is at a low level of granularity.
Once data is populated in the ODS, aggregated data can be loaded into the EDW through the ODS.
Question 5. Where Do We Use Connected And Unconnected Lookups?

Answer :

If there is only one return port, we can go for an unconnected lookup; more than one return port is not possible with an unconnected lookup. If there are multiple return ports, go for a connected lookup.
If you require a dynamic cache, i.e. where your data will change dynamically, then you can go for a connected lookup. If your data is static and will not change while the session loads, you can go for an unconnected lookup.
Question 6. Where Do We Use Semi And Non Additive Facts?

Answer :

Additive: A measure can participate in arithmetic calculations using all or any dimensions.

Ex: Sales revenue
Semi-additive: A measure can participate in arithmetic calculations using only some dimensions.

Ex: Sales amount
Non-additive: A measure cannot participate in arithmetic calculations using dimensions.

Ex: Temperature
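The distinction can be illustrated with a short Python sketch; the fact rows are invented sample data:

```python
# Each fact row carries an additive measure (sales) and a
# non-additive measure (temperature).
facts = [
    {"city": "Pune",   "sales": 100, "temp_c": 30},
    {"city": "Pune",   "sales": 150, "temp_c": 32},
    {"city": "Mumbai", "sales": 200, "temp_c": 28},
]

# Additive: summing sales across all rows is meaningful.
total_sales = sum(f["sales"] for f in facts)

# Non-additive: summing temperatures is meaningless; roll up
# with an average (or a count) instead.
avg_temp = sum(f["temp_c"] for f in facts) / len(facts)

print(total_sales, avg_temp)   # 450 30.0
```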

Question 7. What Are Non-additive Facts In Detail?

Answer :

A fact can be a measure, metric or a dollar value. Measures and metrics are non-additive facts.
Dollar value is an additive fact: if we want to find out the amount for a particular region for a particular time period, we can add the dollar amounts and come up with the total amount.
A non-additive fact, e.g. measured height(s) for 'citizens by geographical location': when we roll up 'city' data to 'state' level data we should not add the heights of the citizens; instead we may need to use it to derive a 'count'.
Question 8. What Is A Staging Area? Do We Need It? What Is The Purpose Of A Staging Area?

Answer :

Data staging is actually a group of procedures used to prepare source system data for loading a data warehouse. Staging includes the following steps:

Source data extraction, Data transformation (restructuring),
Data transformation (data cleansing, value transformations),
Surrogate key assignments.
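The staging steps above can be sketched end to end in Python; the field names, the rejection rule and the surrogate key scheme are illustrative assumptions:

```python
import itertools

# Raw source rows: (source id, name, price-as-text). Invented sample data.
raw = [("101", " widget ", "9.5"),
       ("102", "GADGET", "bad"),
       ("103", "Gizmo", "4.0")]

def stage(rows):
    key_seq = itertools.count(1)          # surrogate key generator
    staged = []
    for src_id, name, price in rows:
        try:
            price = float(price)          # value transformation / cleansing
        except ValueError:
            continue                      # reject rows with dirty values
        staged.append({
            "sk": next(key_seq),          # surrogate key assignment
            "src_id": src_id,
            "name": name.strip().title(), # data cleansing (restructuring)
            "price": price,
        })
    return staged

print(stage(raw))
```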
Question 9. What Is The Latest Version Of Power Center / Power Mart?

Answer :

The Latest Version is 7.2

Question 10. What Are The Modules In Power Mart?

Answer :

 PowerMart Designer
 Server
 Server Manager
 Repository
 Repository Manager
Question 11. What Are Active Transformations / Passive Transformations?

Answer :

An active transformation can change the number of rows that pass through it (decrease or increase rows).
A passive transformation cannot change the number of rows that pass through it.
Question 12. What Are The Different Lookup Methods Used In Informatica?

Answer :

Connected lookup:

A connected lookup receives input from the pipeline and sends output to the pipeline, and can return any number of values; it does not contain a return port.

Unconnected lookup:

An unconnected lookup can return only one column; it contains a return port.

Question 13. Can Informatica Load Heterogeneous Targets From Heterogeneous Sources?

Answer :

No, in Informatica 5.2 and earlier.
Yes, in Informatica 6.1 and later.
Question 14. How Do We Call Shell Scripts From Informatica?

Answer :

Specify the full path of the shell script in the "Post session properties" of the session/workflow.

Question 15. What Is Informatica Metadata And Where Is It Stored?

Answer :

Informatica metadata is data about data, which is stored in Informatica repositories.

Question 16. What Is A Mapping, Session, Worklet, Workflow, Mapplet?

Answer :

A mapping represents dataflow from sources to targets.
A mapplet creates or configures a set of transformations.
A workflow is a set of instructions that tell the Informatica server how to execute the tasks.
A worklet is an object that represents a set of tasks.
A session is a set of instructions that describe how and when to move data from sources to targets.
Question 17. What Are Parameter Files? Where Do We Use Them?

Answer :

A parameter file defines the values for parameters and variables used in a workflow, worklet or session.

Question 18. Can We Override A Native Sql Query Within Informatica? Where Do We Do It? How Do We Do It?

Answer :

Yes, we can override a native SQL query in the Source Qualifier and Lookup transformations.
In the Lookup transformation we can find the "SQL override" option in the lookup properties; by using this option we can do it.

Question 19. Can We Use Procedural Logic Inside Infromatica? If Yes How , If No How Can We Use External Procedural Logic In Informatica?

Answer :

Yes, you can use an Advanced External transformation. You can use the C++ language on Unix, and C++, VB, VC++ on Windows servers.

Question 20. Do We Need An Etl Tool? When Do We Go For The Tools In The Market?

Answer :

ETL Tool:

It is used to Extract (E) data from multiple source systems (like RDBMS, flat files, mainframes, SAP, XML and so on), Transform (T) it based on business requirements, and Load (L) it into target locations (like tables, files etc.).
Need of an ETL Tool:

An ETL tool is typically required when data is scattered across different systems (like RDBMS, flat files, mainframes, SAP, XML and so on).

Question 21. How To Determine What Records To Extract?

Answer :

When addressing a table, some dimension key must reflect the need for a record to get extracted. Mostly it will be from the time dimension (e.g. Date >= 1st of current month) or a transaction flag (e.g. Order Invoiced Status). Foolproof would be adding an archive flag to the record which gets reset when the record changes.
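The criteria above can be sketched as a simple Python filter; the field names and the archive-flag convention are assumptions for illustration:

```python
from datetime import date

def should_extract(row, month_start):
    """Extract a row if it changed this month, or if its archive
    flag was reset (indicating the record changed)."""
    return row["last_modified"] >= month_start or not row["archived"]

month_start = date(2020, 5, 1)   # 1st of the current month
rows = [
    {"id": 1, "last_modified": date(2020, 5, 15), "archived": True},
    {"id": 2, "last_modified": date(2020, 4, 2),  "archived": True},
    {"id": 3, "last_modified": date(2020, 4, 2),  "archived": False},
]

picked = [r["id"] for r in rows if should_extract(r, month_start)]
print(picked)   # [1, 3]
```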

Question 22. What Is Full Load & Incremental Or Refresh Load?

Answer :

Full Load: completely erasing the contents of one or more tables and reloading them with fresh data.
Incremental Load: applying ongoing changes to one or more tables based on a predefined schedule.

Question 23. When Do We Analyze The Tables? How Do We Do It?

Answer :

The ANALYZE statement allows you to validate and compute statistics for an index, table, or cluster. These statistics are used by the cost-based optimizer when it calculates the most efficient plan for retrieval. In addition to its role in statement optimization, ANALYZE also helps in validating object structures and in managing space in your system. You can choose the following operations: COMPUTE, ESTIMATE, and DELETE. Early versions of Oracle7 produced unpredictable results when the ESTIMATE operation was used. It is best to compute your statistics.

EX:
select OWNER,
sum(decode(nvl(NUM_ROWS,9999), 9999,0,1)) analyzed,
sum(decode(nvl(NUM_ROWS,9999), 9999,1,0)) not_analyzed,
count(TABLE_NAME) total
from dba_tables
where OWNER not in ('SYS', 'SYSTEM')
group by OWNER
Question 24. Compare Etl & Manual Development?

Answer :

These are some differences b/w manual and ETL development.
ETL

The process of extracting data from multiple sources (e.g. flat files, XML, COBOL, SAP etc.) is much simpler with the help of tools.
High and clear visibility of logic.
Contains metadata, and changes can be done easily.
Error handling, log summary and load progress make life easier for the developer and maintainer.
Can handle historic data very well.
Manual

Loading data from anything other than flat files and Oracle tables needs more effort.
Complex and not so user-friendly visibility of logic.
No metadata concept, and changes need more effort.
Needs maximum effort from a maintenance point of view.
As data grows, the processing time degrades.
Question 25. What Is Real Time Data-Warehousing?

Answer :

In real-time data warehousing, the warehouse is updated every time the system performs a transaction.
It reflects the real-time business data.
This means that when a query is fired at the warehouse, the state of the business at that time will be returned.
Question 26. Explain The Use Lookup Tables And Aggregate Tables?

Answer :

An aggregate table contains a summarized view of data.
Lookup tables, using the primary key of the target, allow updating of records based on the lookup condition.
Question 27. Define Slowly Changing Dimensions (scd)?

Answer :

SCDs are dimensions whose data changes very slowly.
Eg: city or an employee.

This dimension will change very slowly.
The row of this data in the dimension can be either replaced completely without any track of the old record, OR a new row can be inserted, OR the change can be tracked.
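These three options correspond to SCD Types 1, 2 and 3. A minimal Python sketch of the second option (insert a new row, keeping the old one as history); the field names are illustrative assumptions:

```python
def scd2_update(dim_rows, emp_id, new_city):
    """Type-2 style change: expire the current row for this employee,
    then insert a new row marked as current."""
    for row in dim_rows:
        if row["emp_id"] == emp_id and row["current"]:
            row["current"] = False           # expire the old version
    dim_rows.append({"emp_id": emp_id, "city": new_city, "current": True})
    return dim_rows

# Dimension table with one current row; the employee then moves city.
dim = [{"emp_id": 7, "city": "Pune", "current": True}]
scd2_update(dim, 7, "Mumbai")

current = [r for r in dim if r["current"]]
print(current)   # [{'emp_id': 7, 'city': 'Mumbai', 'current': True}]
```

Both the old and the new row survive, so history is preserved; a Type-1 change would simply overwrite the city in place.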
Question 28. What Is Cube Grouping?

Answer :

A transformer-built set of similar cubes is known as cube grouping. They are generally used in creating smaller cubes that are based on the data in a level of a dimension.

Question 29. What Is Data Warehousing?

Answer :

A data warehouse can be considered as a storage area where relevant data is stored irrespective of the source.
Data warehousing merges data from multiple sources into an easy and complete form.
Question 30. What Is Virtual Data Warehousing?

Answer :

A virtual data warehouse provides a collective view of the completed data. It can be considered as a logical data model of the containing metadata.

Question 31. What Is Active Data Warehousing?

Answer :

An active data warehouse represents a single state of the business. It considers the analytic views of customers and suppliers. It helps to deliver updated data through reports.

Question 32. What Is Data Modeling And Data Mining?

Answer :

Data modeling is a technique used to define and analyze the requirements of data that supports an organization's business process. In simple terms, it is used for the analysis of data objects in order to identify the relationships among these data objects in any business.
Data mining is a technique used to analyze datasets to derive useful insights/information. It is mainly used in retail, consumer goods, telecommunication and financial organizations that have a strong customer orientation, in order to determine the impact on sales, customer satisfaction and profitability.

Question 33. What Are Critical Success Factors?

Answer :

Key areas of activity in which favorable results are necessary for a company to reach its goal.
There are four basic types of CSFs, which are:

Industry CSFs
Strategy CSFs
Environmental CSFs
Temporal CSFs
Question 34. What Is Data Cube Technology Used For?

Answer :

Data cubes are commonly used for easy interpretation of data. A cube is used to represent data along with dimensions as some measures of business needs. Each dimension of the cube represents some attribute of the database, e.g. sales per day, month or year.

Question 35. What Is Data Cleaning?

Answer :

Data cleaning is also known as data scrubbing.
Data cleaning is a process which ensures the set of data is correct and accurate. Data accuracy and consistency, and data integration, are checked during data cleaning. Data cleaning can be applied to a single set of data or to multiple sets of data which need to be merged.
Question 36. Explain How To Mine An Olap Cube?

Answer :

An extension of data mining can be used for slicing the data of the source cube in OLAP data mining.
The case table is dimensioned at the time of mining a cube.

Question 37. What Are Different Stages Of Data Mining?

Answer :

Data mining is a logical process of searching through large amounts of data to find important data.
Stage 1: Exploration: One will want to explore and prepare the data. The goal of the exploration stage is to find important variables and determine their nature.
Stage 2: Pattern identification: Searching for patterns and choosing the one which allows making the best prediction is the primary action in this stage.
Stage 3: Deployment stage: This stage cannot be reached until a consistent, highly predictive pattern is found in stage 2. The pattern found in stage 2 can then be applied to see whether or not the desired outcome is achieved.

Question 38. What Are The Different Problems That Data Mining Can Solve?

Answer :

Data mining can be used in a variety of fields/industries, like marketing of products and services, AI, and government intelligence.
The US FBI uses data mining for screening security and intelligence, to identify illegal and incriminating e-information distributed over the internet.

Question 39. What Is Data Purging?

Answer :

Deleting data from a data warehouse is known as data purging. Usually junk data like rows with null values or spaces is cleaned up.
Data purging is the process of cleaning out this kind of junk values.

Question 40. What Is Bus Schema?

Answer :

A BUS schema is used to identify the common dimensions across business processes, i.e. identifying conforming dimensions. It has conformed dimensions and standardized definitions of facts.

Question 41. Define Non-additive Facts?

Answer :

Non-additive facts are facts that cannot be summed up for any of the dimensions present in the fact table. These columns cannot be added for generating any results.

Question 42. What Is Conformed Fact? What Are Conformed Dimensions Used For?

Answer :

A conformed fact in a warehouse allows itself to have the same name in separate tables. They can be compared and combined mathematically. Conformed dimensions can be used across multiple data marts. They have a static structure. Any dimension table that is used by multiple fact tables can be a conformed dimension.

Question 43. What Is A Three Tier Data Warehouse?

Answer :

A data warehouse can be thought of as a three-tier system in which a middle system provides usable data in a secure way to end users. On either side of this middle system are the end users and the back-end data stores.

Question 44. What Is The Metadata Extension?

Answer :

Informatica allows end users and partners to extend the metadata stored in the repository by associating information with individual objects in the repository. For example, when you create a mapping, you can store your contact information with the mapping. You associate information with repository metadata using metadata extensions.
Informatica Client applications can contain the following types of metadata extensions:
Vendor-defined: Third-party application vendors create vendor-defined metadata extensions. You can view and change the values of vendor-defined metadata extensions, but you cannot create, delete, or redefine them.
User-defined: You create user-defined metadata extensions using PowerCenter/PowerMart. You can create, edit, delete, and view user-defined metadata extensions. You can also change the values of user-defined extensions.
Question 45. How Can We Use Mapping Variables In Informatica? Where Do We Use Them?

Answer :

Yes. We can use mapping variables in Informatica.
The Informatica server saves the value of a mapping variable to the repository at the end of a session run and uses that value the next time we run the session.

Question 46. What Is The Etl Process? How Many Steps Does Etl Contain? Explain With Example.

Answer :

ETL is the extract, transform, load process: you extract data from the source and apply the business rules on it, then you load it into the target. The steps are:

define the source (create the ODBC connection to the source DB)
define the target (create the ODBC connection to the target DB)
create the mapping (you can apply the business rules here by adding transformations, and define how the data flow will go from the source to the target)
create the session (it's a set of instructions that run the mapping)
create the workflow (instructions that run the session)
Question 47. Give Some Popular Tools?

Answer :

Popular Tools:

IBM WebSphere Information Integration (Ascential DataStage)
Ab Initio
Informatica
Talend
Question 48. Give Some Etl Tool Functionalities?

Answer :

While the selection of a database and a hardware platform is a must, the selection of an ETL tool is highly recommended, but it is not a must. When you evaluate ETL tools, it pays to look for the following characteristics:

Functional capability: This includes both the 'transformation' piece and the 'cleansing' piece. In general, typical ETL tools are either geared towards having strong transformation capabilities or having strong cleansing capabilities, but they are seldom very strong in both. As a result, if you know your data is going to be dirty coming in, make sure your ETL tool has strong cleansing capabilities. If there are going to be a lot of different data transformations, it then makes sense to pick a tool that is strong in transformation.
Ability to read directly from your data source: For each organization, there is a different set of data sources. Make sure the ETL tool you pick can connect directly to your source data.
Metadata support: The ETL tool plays a key role in your metadata because it maps the source data to the destination, which is an important piece of the metadata. In fact, some organizations have come to rely on the documentation of their ETL tool as their metadata source. As a result, it is very important to choose an ETL tool that works with your overall metadata strategy.
Question 49. How Can I Edit The Xml Target, Are There Anyways Apart From The Editing The Xsd File. Can I Directly Edit The Xml Directly In Informatica Designer?

Answer :

No, you cannot edit it from Informatica Designer. But you can still change the precision of the ports if the XML source is imported from a DTD file.

Question 50. What Is The Difference Between Informatica 7.0 & 8.0?

Answer :

The only difference b/w Informatica 7 & 8 is that 8 is a SOA (Service Oriented Architecture) while 7 is not. SOA in Informatica is handled through a special grid designed in the server.

Question 51. Where Do We Use Connected And Unconnected Lookups?

Answer :

If there is only one return port, we can go for an unconnected lookup; more than one return port is not possible with an unconnected lookup. If there are multiple return ports, go for a connected lookup.

Question 52. What Are The Various Tools? - Name A Few.

Answer :

 Ab Initio
 DataStage
 Informatica
 Cognos Decision Stream
 Oracle Warehouse Builder
 Business Objects XI (Extreme Insight)
 SAP Business Warehouse
 SAS Enterprise ETL Server
Question 53. What Are The Various Test Procedures Used To Check Whether The Data Is Loaded In The Backend, Performance Of The Mapping, And Quality Of The Data Loaded In Informatica?

Answer :

The best way is to take the help of the Debugger, where we can monitor each and every process of the mappings and see how data is loading based on condition breaks.

Question 54. What Is The Difference Between Joiner And Lookup?

Answer :

A Joiner is used to join two or more tables to retrieve data from the tables (similar to joins in SQL).
A Lookup is used to check and compare a source table and a target table (similar to a correlated sub-query in SQL).
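The contrast can be sketched with plain Python dicts (the sample tables are invented): a joiner produces combined rows from two tables, while a lookup fetches one value per source row:

```python
orders = [{"order_id": 1, "cust_id": "A"},
          {"order_id": 2, "cust_id": "B"}]
customers = {"A": "Alice", "B": "Bob"}   # reference table keyed by id

# Joiner-style: combine the two tables on the key, emitting joined rows.
joined = [{"order_id": o["order_id"], "name": customers[o["cust_id"]]}
          for o in orders if o["cust_id"] in customers]

# Lookup-style: per source row, check the reference table and return a
# single value (with a default on a lookup miss).
def lookup_name(cust_id):
    return customers.get(cust_id, "UNKNOWN")

print(joined[0]["name"], lookup_name("Z"))   # Alice UNKNOWN
```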

Question 55. If A Flat File Contains 1000 Records, How Can I Get First And Last Records Only?

Answer :

By using an Aggregator transformation with the FIRST and LAST functions we can get the first and last records.
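The FIRST/LAST idea can be approximated in Python by streaming the records once; the file contents here are an invented stand-in for the 1000-record flat file:

```python
def first_and_last(records):
    """Stream the records once, keeping only the first and last.
    Assumes at least one record is present."""
    it = iter(records)
    first = last = next(it)
    for last in it:       # `last` ends up holding the final record
        pass
    return first, last

records = [f"rec{i}" for i in range(1, 1001)]   # 1000 sample rows
first, last = first_and_last(records)
print(first, last)   # rec1 rec1000
```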

Question 56. How Do You Calculate Fact Table Granularity?

Answer :

Granularity is the level of detail which the fact table is describing; for example, if we are doing a time analysis, the granularity may be day based, month based or year based.

Question 57. What Are The Different Versions Of Informatica?

Answer :

Here are a few popular versions of Informatica:

Informatica PowerCenter 4.1,
Informatica PowerCenter 5.1,
Informatica PowerCenter 6.1.2,
Informatica PowerCenter 7.1.2,
Informatica PowerCenter 8.1,
Informatica PowerCenter 8.5,
Informatica PowerCenter 8.6.
Question 58. Techniques Of Error Handling - Ignore , Rejecting Bad Records To A Flat File , Loading The Records And Reviewing Them (default Values)?

Answer :

Records are rejected either at the database due to a constraint key violation, or by the Informatica server while writing data into the target table. These rejected records can be found in the bad files folder, where a reject file will be created for a session. We can check why a record has been rejected. This bad file contains a row indicator in the first column and a column indicator in the second column.
These row indicators are of four types:
D - valid data,
O - overflowed data,
N - null data,
T - truncated data.
Depending on these indicators we can make changes to load the data correctly to the target.
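A sketch of summarizing such a reject file in Python; the comma-separated layout and field order are a simplified assumption for illustration, not the exact bad-file format:

```python
# Map of row indicators to their meanings, as listed above.
ROW_INDICATORS = {"D": "valid", "O": "overflowed", "N": "null", "T": "truncated"}

def classify(bad_file_lines):
    """Count reject-file lines by their row indicator (first column)."""
    summary = {}
    for line in bad_file_lines:
        row_ind = line.split(",")[0]
        kind = ROW_INDICATORS.get(row_ind, "unknown")
        summary[kind] = summary.get(kind, 0) + 1
    return summary

# Invented sample reject-file lines: row indicator, column indicator, data.
lines = ["D,D,1,abc", "N,D,2,", "T,O,3,toolong", "N,D,4,"]
print(classify(lines))   # {'valid': 1, 'null': 2, 'truncated': 1}
```

A summary like this is a quick way to decide which indicator types dominate before fixing the mapping or the source data.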

Question 59. What Is The Difference Between Power Center & Power Mart?

Answer :

PowerCenter - has the ability to organize repositories into a data mart domain and share metadata across repositories.
PowerMart - only a local repository can be created.

Question 60. What Are Snapshots? What Are Materialized Views & Where Do We Use Them? What Is A Materialized View Log?

Answer :

Snapshots are read-only copies of a master table located on a remote node, which are periodically refreshed to reflect changes made to the master table. Snapshots are mirrors or replicas of tables.
Views are built using the columns from one or more tables. A single-table view can be updated, but a view over multiple tables cannot be updated.
A view can be updated/deleted/inserted into if it has only one base table; if the view is based on columns from more than one table, then insert, update and delete are not possible.
Materialized view:
A pre-computed table comprising aggregated or joined data from fact and possibly dimension tables. Also known as a summary or aggregate table.



