Most useful Teradata interview questions -- PART - VII
Data Connectors
A Data Connector Operator can function as a file reader or file writer.
It can act as a file reader to read from flat files or an access module, thus becoming a producer of a data stream.
It can also act as a file writer to flat files or an access module, thus becoming a consumer of a data stream.
If the file is being read, you must use TYPE DATACONNECTOR PRODUCER.
If the file is being written to, use TYPE DATACONNECTOR CONSUMER.
You must also put the filename and the format statements in the script.
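As a sketch, a DataConnector producer definition in a TPT script might look like the following; the schema name, directory, and file name are hypothetical placeholders:

```sql
DEFINE OPERATOR FILE_READER
TYPE DATACONNECTOR PRODUCER
SCHEMA EMPLOYEE_SCHEMA                       /* hypothetical schema defined elsewhere in the script */
ATTRIBUTES
(
    VARCHAR DirectoryPath = '/data/input/',  /* hypothetical directory */
    VARCHAR FileName      = 'employees.txt', /* hypothetical file */
    VARCHAR Format        = 'Delimited',
    VARCHAR TextDelimiter = '|',
    VARCHAR OpenMode      = 'Read'
);
```

A consumer definition is the mirror image: TYPE DATACONNECTOR CONSUMER with OpenMode = 'Write'.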
Related Links
Most useful Teradata interview questions -- PART - I
Most useful Teradata interview questions -- PART - II
Most useful Teradata interview questions -- PART - III
Most useful Teradata interview questions -- PART - IV
Most useful Teradata interview questions -- PART - V
Most useful Teradata interview questions -- PART - VI
Most useful Teradata interview questions -- PART - VII
293. Update Operator
- The Update Operator is a consumer operator that uses the old MultiLoad protocol to perform Inserts, Updates, Deletes and Upserts.
- Works on up to five tables in a single job.
- Takes advantage of multiple instances and multiple sessions.
- Provides the ability to place conditional logic in the script for applying changes.
- Data blocks are written only once.
- Uses one of the loader slots.
- The Update Operator acts just like the old MultiLoad utility.
- This is a block utility.
- There can be no Unique Secondary Indexes, no triggers, no referential integrity, and no join indexes on the target tables.
- For large-volume data maintenance, the Update operator is the choice. Generally, when a table is first loaded on Teradata, the (FastLoad) TPT Load operator is used.
- After the initial load, the (MultiLoad) TPT Update protocol is used to maintain the table.
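As a sketch, an Update operator definition and APPLY step in a TPT script might look like this; the system name, credentials, and table names are hypothetical:

```sql
DEFINE OPERATOR UPDATE_OPERATOR
TYPE UPDATE
SCHEMA EMPLOYEE_SCHEMA                        /* hypothetical schema */
ATTRIBUTES
(
    VARCHAR TdpId        = 'mydbc',           /* hypothetical system name */
    VARCHAR UserName     = 'tptuser',         /* hypothetical credentials */
    VARCHAR UserPassword = 'tptpass',
    VARCHAR TargetTable  = 'HR.Employee',     /* hypothetical target table */
    VARCHAR LogTable     = 'HR.Employee_Log',
    INTEGER MaxSessions  = 4
);

APPLY
    ('UPDATE HR.Employee SET Salary = :Salary WHERE EmpNo = :EmpNo;')
TO OPERATOR (UPDATE_OPERATOR[2])              /* two instances */
SELECT * FROM OPERATOR (FILE_READER[2]);
```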
294. Teradata V14.10 Extended MultiLoad Protocol (MLOADX)
- Teradata Parallel Transporter (TPT) in Teradata V14.10 can use the new Extended MultiLoad Protocol (MLOADX) to allow an Update job (the old MultiLoad protocol) to load tables with Unique Secondary Indexes, Join Indexes, Triggers, tables with RI, and even Temporal tables.
- When a V14.10 TPT Update job runs, the target tables are examined and MLOADX is used if needed.
- MLOADX works by using a single NoPI staging table for the fields defined in the import record layout plus the matchtag fields.
- Matchtag fields are used to support the order of MLOADX application.
- An Array INSERT is used to populate the NoPI staging table in the acquisition phase.
- Array INSERT requests run in SQL sessions.
- MERGE-INTO is used to apply data from the staging table to a target table during the application phase.
- The DML is rewritten into a MERGE-INTO statement during the DML phase.
- Errors detected during the application phase are logged into the MLOAD UV error table.
- MERGE-INTO handles MultiLoad-style error treatment.
- MultiLoad DELETE uses the new MERGE-INTO DELETE-only operation which is also available for general SQL users.
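For illustration, the kind of statement MLOADX produces in the application phase follows standard Teradata MERGE-INTO syntax; the target table, staging table, and columns below are hypothetical:

```sql
MERGE INTO HR.Employee AS tgt                 /* hypothetical target table */
USING HR.Employee_Stg AS stg                  /* hypothetical NoPI staging table */
   ON tgt.EmpNo = stg.EmpNo
WHEN MATCHED THEN
    UPDATE SET Salary = stg.Salary
WHEN NOT MATCHED THEN
    INSERT (EmpNo, Salary)
    VALUES (stg.EmpNo, stg.Salary);
```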
295. Stream Operator
- The Stream Operator is a consumer operator that uses the SQL protocol to perform near real-time updates to one or more tables.
- Uses row-hash locks, allowing concurrent updates on the same table.
- Just like TPump, the Stream Operator performs Inserts, Updates, Deletes, and Upserts on populated tables (or empty tables).
- Tables can be SET or MULTISET tables.
- Just like TPump, the Stream Operator allows users to specify how many updates occur via its pack and rate settings.
- Stream does not use a loader slot.
- The Stream Operator acts just like the old TPump utility.
- There can be Secondary Indexes, Triggers, Referential Integrity, and Join Indexes.
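A sketch of how pack and rate might be set on a Stream operator; the system name, credentials, and log table are hypothetical:

```sql
DEFINE OPERATOR STREAM_OPERATOR
TYPE STREAM
SCHEMA TXN_SCHEMA                     /* hypothetical schema */
ATTRIBUTES
(
    VARCHAR TdpId        = 'mydbc',   /* hypothetical system name */
    VARCHAR UserName     = 'tptuser', /* hypothetical credentials */
    VARCHAR UserPassword = 'tptpass',
    VARCHAR LogTable     = 'HR.Txn_Log',
    INTEGER Pack         = 20,        /* statements packed into one request */
    INTEGER Rate         = 1000       /* maximum statements sent per minute */
);
```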

296. The easiest way to run a TPT script is to use the tbuild utility.
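For example, assuming the script is saved as load_job.txt, it might be invoked from the command line like this (the file name and job name are hypothetical):

```
tbuild -f load_job.txt my_job_name
```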




297. In Teradata, what is the significance of the UPSERT command?
UPSERT basically stands for Update Else Insert. Teradata supports this natively through its atomic UPDATE ... ELSE INSERT syntax, which is specific to Teradata.
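A sketch of Teradata's atomic upsert syntax, using a hypothetical table and values:

```sql
UPDATE HR.Employee                            /* hypothetical table */
SET Salary = 55000
WHERE EmpNo = 101
ELSE INSERT INTO HR.Employee (EmpNo, Salary)
VALUES (101, 55000);
```

If the row with EmpNo = 101 exists it is updated; otherwise the row is inserted, all in one statement.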
298. Let us say there is a file that consists of 100 records, out of which we need to skip the first 20 and the last 20 records. What will the code snippet look like?
We need to use the BTEQ utility for this task. SKIP 20 and REPEAT 60 will be used in the script.
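A sketch of such a BTEQ import script; the credentials, file, table, and columns are hypothetical. SKIP = 20 ignores the first 20 records, and REPEAT 60 processes the next 60, leaving the final 20 unread:

```sql
.LOGON mydbc/tptuser,tptpass;      /* hypothetical credentials */
.IMPORT DATA FILE = employees.dat, SKIP = 20;
.REPEAT 60
USING (EmpNo INTEGER, Salary DECIMAL(10,2))
INSERT INTO HR.Employee (EmpNo, Salary)   /* hypothetical table */
VALUES (:EmpNo, :Salary);
.QUIT;
```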
299. How do you transfer a large amount of data in Teradata?
Transferring a large amount of data can be done using the various Teradata utilities, i.e. BTEQ, FastLoad, MultiLoad, TPump and FastExport.
- BTEQ (Basic Teradata Query) supports all 4 DMLs: SELECT, INSERT, UPDATE and DELETE. BTEQ also supports IMPORT/EXPORT protocols.
- FastLoad, MultiLoad and TPump transfer data from the host to Teradata.
- FastExport is used to export data from Teradata to the Host.